Pairs of Random Variables


LECTURE 4

Pairs of Random Variables

4.1 TWO RANDOM VARIABLES

Let (Ω, F, P) be a probability space and consider two measurable mappings X: Ω → ℝ and Y: Ω → ℝ. In other words, X and Y are two random variables defined on the common probability space (Ω, F, P). In order to specify the random variables, we need to determine

    P{(X, Y) ∈ A} = P({ω : (X(ω), Y(ω)) ∈ A})

for every Borel set A ⊆ ℝ². By the properties of the Borel σ-field, it can be shown that it suffices to determine the probabilities of the form

    P{a < X < b, c < Y < d},  a < b, c < d,

or equivalently, the probabilities of the form

    P{X ≤ x, Y ≤ y},  x, y ∈ ℝ.

The latter defines their joint cdf

    F_{X,Y}(x, y) = P{X ≤ x, Y ≤ y},  x, y ∈ ℝ,

which is the probability of the shaded region in the figure below.

The joint cdf satisfies the following properties:

1. F_{X,Y}(x, y) ≥ 0.

2. Monotonicity. If x₁ ≤ x₂ and y₁ ≤ y₂, then F_{X,Y}(x₁, y₁) ≤ F_{X,Y}(x₂, y₂).

3. Limits.

    lim_{x,y→∞} F_{X,Y}(x, y) = 1,
    lim_{x→−∞} F_{X,Y}(x, y) = lim_{y→−∞} F_{X,Y}(x, y) = 0.

[Figure: an illustration of the joint cdf of X and Y as the probability of the region {X ≤ x, Y ≤ y}.]

4. Marginal cdfs.

    lim_{y→∞} F_{X,Y}(x, y) = F_X(x),  lim_{x→∞} F_{X,Y}(x, y) = F_Y(y).

The probability of any (Borel) set can be determined from the joint cdf. For example,

    P{a < X ≤ b, c < Y ≤ d} = F_{X,Y}(b, d) − F_{X,Y}(a, d) − F_{X,Y}(b, c) + F_{X,Y}(a, c),

as illustrated below.

[Figure: an illustration of P{a < X ≤ b, c < Y ≤ d} as the rectangle (a, b] × (c, d].]

We say that X and Y are statistically independent, or independent in short, if for every pair of

(Borel) sets A and B,

    P{X ∈ A, Y ∈ B} = P{X ∈ A} · P{Y ∈ B}.

Equivalently, X and Y are independent if

    F_{X,Y}(x, y) = F_X(x) F_Y(y),  x, y ∈ ℝ.

In the following, we focus on three special cases and discuss the joint, marginal, and conditional distributions of X and Y for each case:

- X and Y are discrete.
- X and Y are continuous.
- X is discrete and Y is continuous (mixed).

4.2 PAIRS OF DISCRETE RANDOM VARIABLES

Let X and Y be discrete random variables on the same probability space. They are completely specified by their joint pmf

    p_{X,Y}(x, y) = P{X = x, Y = y},  x ∈ X, y ∈ Y.

By the axioms of probability,

    Σ_{x∈X} Σ_{y∈Y} p_{X,Y}(x, y) = 1.

We use the law of total probability to find the marginal pmf of X:

    p_X(x) = Σ_{y∈Y} p_{X,Y}(x, y),  x ∈ X.

The conditional pmf of X given Y = y is defined as

    p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y),  x ∈ X,  p_Y(y) ≠ 0.

Check that if p_Y(y) ≠ 0, then p_{X|Y}(x|y) is a pmf for X.

Example. Consider the pmf p(x, y) described by the following table.

[Table: joint pmf values p(x, y); not recoverable from the source.]

Then

    p_X(x) = 1/4 for x = 0,  3/8 for x = 1,  3/8 for x = 2.5,

and

    p_{X|Y}(x|2) = 1/2 for x = 0,  1/2 for x = 1.

Chain rule.

    p_{X,Y}(x, y) = p_X(x) p_{Y|X}(y|x) = p_Y(y) p_{X|Y}(x|y).

Independence. X and Y are independent if

    p_{X,Y}(x, y) = p_X(x) p_Y(y),  x ∈ X, y ∈ Y,

which is equivalent to

    p_{X|Y}(x|y) = p_X(x),  x ∈ X, whenever p_Y(y) ≠ 0.

Law of total probability. For any event A,

    P(A) = Σ_{x∈X} p_X(x) P(A | X = x).

Bayes rule. Given p_X(x) and p_{Y|X}(y|x) for every (x, y) ∈ X × Y, we can find p_{X|Y}(x|y) as

    p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y)
                 = p_X(x) p_{Y|X}(y|x) / p_Y(y)
                 = p_X(x) p_{Y|X}(y|x) / Σ_{u∈X} p_{X,Y}(u, y)
                 = p_X(x) p_{Y|X}(y|x) / Σ_{u∈X} p_X(u) p_{Y|X}(y|u).

The final formula is entirely in terms of the known quantities p_X(x) and p_{Y|X}(y|x).

Example (Binary symmetric channel). Consider the binary communication channel shown below. The bit sent is X ~ Bern(p), p ∈ [0, 1], and the bit received is

    Y = (X + Z) mod 2 = X ⊕ Z,

where the noise Z ~ Bern(ε), ε ∈ [0, 1/2], is independent of X. We find p_{X|Y}(x|y), p_Y(y), and P{X ≠ Y}, namely, the probability of error.

[Figure: binary symmetric channel. The noise Z ∈ {0, 1} is added mod 2 to the input X ∈ {0, 1} to produce the output Y ∈ {0, 1}.]

First, we use the Bayes rule

    p_{X|Y}(x|y) = p_{Y|X}(y|x) p_X(x) / Σ_{u∈X} p_{Y|X}(y|u) p_X(u).

We know p_X(x), but we need to find p_{Y|X}(y|x):

    p_{Y|X}(y|x) = P{Y = y | X = x}
                 = P{X ⊕ Z = y | X = x}
                 = P{x ⊕ Z = y | X = x}
                 = P{Z = y ⊕ x | X = x}
                 = P{Z = y ⊕ x}   since Z and X are independent
                 = p_Z(y ⊕ x).

Therefore

    p_{Y|X}(0|0) = p_Z(0 ⊕ 0) = p_Z(0) = 1 − ε,
    p_{Y|X}(0|1) = p_Z(0 ⊕ 1) = p_Z(1) = ε,
    p_{Y|X}(1|0) = p_Z(1 ⊕ 0) = p_Z(1) = ε,
    p_{Y|X}(1|1) = p_Z(1 ⊕ 1) = p_Z(0) = 1 − ε.

Plugging into the Bayes rule equation, we obtain

    p_{X|Y}(0|0) = p_{Y|X}(0|0) p_X(0) / [p_{Y|X}(0|0) p_X(0) + p_{Y|X}(0|1) p_X(1)]
                 = (1 − ε)(1 − p) / [(1 − ε)(1 − p) + εp],
    p_{X|Y}(1|0) = 1 − p_{X|Y}(0|0) = εp / [(1 − ε)(1 − p) + εp],
    p_{X|Y}(0|1) = p_{Y|X}(1|0) p_X(0) / [p_{Y|X}(1|0) p_X(0) + p_{Y|X}(1|1) p_X(1)]
                 = ε(1 − p) / [ε(1 − p) + (1 − ε)p],
    p_{X|Y}(1|1) = 1 − p_{X|Y}(0|1) = (1 − ε)p / [ε(1 − p) + (1 − ε)p].

Along the way we already found p_Y(y) (the denominators above):

    p_Y(y) = p_{Y|X}(y|0) p_X(0) + p_{Y|X}(y|1) p_X(1)
           = (1 − ε)(1 − p) + εp   for y = 0,
             ε(1 − p) + (1 − ε)p   for y = 1.

Now to find the probability of error P{X ≠ Y}, consider

    P{X ≠ Y} = p_{X,Y}(0, 1) + p_{X,Y}(1, 0)
             = p_{Y|X}(1|0) p_X(0) + p_{Y|X}(0|1) p_X(1)
             = ε(1 − p) + εp
             = ε.

Alternatively, P{X ≠ Y} = P{Z = 1} = ε.

An interesting special case is ε = 1/2, whence P{X ≠ Y} = 1/2, which is the worst possible (no information is sent), and

    p_Y(0) = (1/2)p + (1/2)(1 − p) = 1/2 = p_Y(1).

Therefore Y ~ Bern(1/2), regardless of the value of p. In this case, the bit sent X and the bit received Y are independent (check!).

4.3 PAIRS OF CONTINUOUS RANDOM VARIABLES

4.3.1 Joint and Marginal Densities

We say that X and Y are jointly continuous random variables if their joint cdf is continuous in both x and y. In this case, we can define their joint pdf, provided that it exists, as the function f_{X,Y}(x, y) such that

    F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{X,Y}(u, v) du dv,  x, y ∈ ℝ.

If F_{X,Y}(x, y) is differentiable in x and y, then

    f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x ∂y
        = lim_{Δx,Δy→0} [F_{X,Y}(x + Δx, y + Δy) − F_{X,Y}(x + Δx, y) − F_{X,Y}(x, y + Δy) + F_{X,Y}(x, y)] / (Δx Δy)
        = lim_{Δx,Δy→0} P{x < X ≤ x + Δx, y < Y ≤ y + Δy} / (Δx Δy).

The joint pdf f_{X,Y}(x, y) satisfies the following properties:

1. f_{X,Y}(x, y) ≥ 0.

2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1.

3. The probability of any set A ⊆ ℝ² can be calculated by integrating the joint pdf over A:

    P{(X, Y) ∈ A} = ∫∫_{(x,y)∈A} f_{X,Y}(x, y) dx dy.
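As a numerical aside (not part of the original notes), the binary symmetric channel formulas derived above can be checked with exact rational arithmetic. The helper name `bsc` and the sample values p = 1/4, ε = 1/10 are arbitrary illustrative choices:

```python
from fractions import Fraction

def bsc(p, eps):
    """Exact BSC computations: X ~ Bern(p), Y = X xor Z, Z ~ Bern(eps) independent of X."""
    p_X = {0: 1 - p, 1: p}
    def lik(y, x):                    # p_{Y|X}(y|x) = p_Z(y xor x)
        return 1 - eps if y == x else eps
    p_Y = {y: sum(lik(y, x) * p_X[x] for x in (0, 1)) for y in (0, 1)}
    post = {(x, y): lik(y, x) * p_X[x] / p_Y[y] for x in (0, 1) for y in (0, 1)}
    p_err = sum(lik(1 - x, x) * p_X[x] for x in (0, 1))   # P{X != Y}
    return p_Y, post, p_err

p, eps = Fraction(1, 4), Fraction(1, 10)
p_Y, post, p_err = bsc(p, eps)
assert p_err == eps                   # P{X != Y} = eps, whatever p is
assert post[1, 1] == (1 - eps) * p / (eps * (1 - p) + (1 - eps) * p)
assert bsc(Fraction(1, 3), Fraction(1, 2))[0][0] == Fraction(1, 2)   # eps = 1/2: Y ~ Bern(1/2)
```

Using `Fraction` instead of floats makes the agreement with the closed-form posteriors exact rather than approximate.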

The marginal pdf of X can be obtained from the joint pdf via the law of total probability:

    f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy.

To see this, recall that f_X(x) = (d/dx) F_X(x) and consider

    F_X(x) = lim_{y→∞} F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(u, y) dy du = ∫_{−∞}^{x} g(u) du

with g(u) = ∫_{−∞}^{∞} f_{X,Y}(u, y) dy. We then differentiate F_X(x) with respect to x.

Alternatively, we can partition the strip {x < X ≤ x + Δ} into Δ × Δ squares, as in the following figure,

[Figure: the strip {x < X ≤ x + Δ} partitioned into squares of side Δ.]

and note that

    f_X(x) = lim_{Δ→0} P{x < X ≤ x + Δ} / Δ
           = lim_{Δ→0} Σ_{n=−∞}^{∞} P{x < X ≤ x + Δ, nΔ < Y ≤ (n + 1)Δ} / Δ
           = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy.

Example. Let (X, Y) ~ f(x, y), where

    f(x, y) = c  if x ≥ 0, y ≥ 0, x + y ≤ 1,
              0  otherwise.

We find c, f_Y(y), and P{Y ≤ 2X}. Note first that

    1 = ∫∫ f(x, y) dx dy = ∫₀¹ ∫₀^{1−y} c dx dy = c ∫₀¹ (1 − y) dy = c/2.

Hence, c = 2. To find f_Y(y), we use the law of total probability:

    f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀^{1−y} 2 dx = 2(1 − y)  for 0 ≤ y ≤ 1,
             0  otherwise.

[Figure: the marginal pdf f_Y(y), decreasing linearly from 2 at y = 0 to 0 at y = 1.]

To find the probability of the set {Y ≤ 2X}, we first sketch the set.

[Figure: the support triangle together with the line y = 2x, which meets the boundary x + y = 1 at (1/3, 2/3); {Y ≤ 2X} is the part of the triangle below the line.]

From the figure we find that

    P{Y ≤ 2X} = P{X ≥ Y/2} = ∫∫_{(x,y): x ≥ y/2} f_{X,Y}(x, y) dx dy
              = ∫₀^{2/3} ∫_{y/2}^{1−y} 2 dx dy = ∫₀^{2/3} (2 − 3y) dy = 2/3.

Recall that X and Y are independent if f_{X,Y}(x, y) = f_X(x) f_Y(y) for every x, y. But f_{X,Y}(0, 1) = 2 and f_X(0) f_Y(1) = 0, so X and Y are not independent. Alternatively, consider

    f_{X,Y}(1/2, 1/2) = 2 ≠ 1 = f_X(1/2) f_Y(1/2)

to see that X and Y are dependent.
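The values c = 2 and P{Y ≤ 2X} = 2/3 in the example above can be double-checked with a crude midpoint Riemann sum. This is a numerical sketch, not part of the notes; the grid resolution n is an arbitrary choice:

```python
# Midpoint Riemann sum over the unit square for f(x, y) = 2 on the
# triangle {x >= 0, y >= 0, x + y <= 1}, and zero elsewhere.
n = 1000                 # grid resolution (arbitrary)
h = 1.0 / n
total = 0.0              # integral of f over the plane, should be 1
prob = 0.0               # P{Y <= 2X}, should be 2/3
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:           # inside the support of f
            total += 2.0 * h * h
            if y <= 2.0 * x:       # inside the event {Y <= 2X}
                prob += 2.0 * h * h
```

The discretization error comes only from cells straddling the two boundary lines, so it shrinks like 1/n.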

4.3.2 Conditional Densities

Let X and Y be continuous random variables with joint pdf f_{X,Y}(x, y). We wish to define

    F_{Y|X}(y | X = x) = P{Y ≤ y | X = x}.

We cannot define this conditional probability as

    P{Y ≤ y, X = x} / P{X = x},

since both the numerator and the denominator are equal to zero. Instead, we define the conditional probability for continuous random variables as a limit:

    F_{Y|X}(y|x) = lim_{Δ→0} P{Y ≤ y | x < X ≤ x + Δ}
                 = lim_{Δ→0} P{x < X ≤ x + Δ, Y ≤ y} / P{x < X ≤ x + Δ}
                 = lim_{Δ→0} ∫_{x}^{x+Δ} ∫_{−∞}^{y} f_{X,Y}(u, v) dv du / (f_X(x) Δ)
                 = ∫_{−∞}^{y} f_{X,Y}(x, v) dv / f_X(x).

By differentiating w.r.t. y, we thus define the conditional pdf as

    f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x)   if f_X(x) ≠ 0.

It can be readily checked that f_{Y|X}(y|x) is a valid pdf and

    F_{Y|X}(y|x) = ∫_{−∞}^{y} f_{Y|X}(u|x) du

is a valid cdf for every x such that f_X(x) > 0. Sometimes we use the notation

    Y | {X = x} ~ f_{Y|X}(y|x)

to denote that Y has a conditional pdf f_{Y|X}(y|x) given X = x.

Chain rule.

    f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y|x) = f_Y(y) f_{X|Y}(x|y).

Independence. X and Y are independent iff

    f_{X,Y}(x, y) = f_X(x) f_Y(y),  x, y ∈ ℝ,

or equivalently,

    f_{X|Y}(x|y) = f_X(x),  x ∈ ℝ, whenever f_Y(y) ≠ 0.

Law of total probability. For any event A,

    P(A) = ∫_{−∞}^{∞} f_X(x) P(A | X = x) dx.

Bayes rule. Given f_X(x) and f_{Y|X}(y|x), we can find

    f_{X|Y}(x|y) = f_{Y|X}(y|x) f_X(x) / f_Y(y)
                 = f_X(x) f_{Y|X}(y|x) / ∫ f_{X,Y}(u, y) du
                 = f_X(x) f_{Y|X}(y|x) / ∫ f_X(u) f_{Y|X}(y|u) du.

Example. As in the example of the previous subsection, let f(x, y) be defined as

    f(x, y) = 2  if x ≥ 0, y ≥ 0, x + y ≤ 1,
              0  otherwise.

We already know that

    f_Y(y) = 2(1 − y)  for 0 ≤ y ≤ 1,
             0  otherwise.

Therefore

    f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y) = 1/(1 − y)  for 0 ≤ x ≤ 1 − y, 0 ≤ y < 1,
                   0  otherwise.

In other words, X | {Y = y} ~ Unif[0, 1 − y].

[Figure: the conditional pdf f_{X|Y}(x|y), uniform on the interval [0, 1 − y].]
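To illustrate the mechanics numerically (an illustrative sketch, not from the notes): computing f_Y(y) for the triangle density by numerical integration and then dividing shows that the conditional density is indeed flat on [0, 1 − y]:

```python
def f(x, y):
    """Joint pdf of the running example: 2 on the triangle x, y >= 0, x + y <= 1."""
    return 2.0 if x >= 0.0 and y >= 0.0 and x + y <= 1.0 else 0.0

def f_Y(y, n=20000):
    """Marginal f_Y(y) = integral of f(x, y) dx, via a midpoint sum over [0, 1]."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h, y) for i in range(n)) * h

y = 0.3
fy = f_Y(y)                                      # should be 2(1 - y) = 1.4
cond = [f(x, y) / fy for x in (0.1, 0.3, 0.6)]   # f_{X|Y}(x|0.3) at points of the support
```

All three values of `cond` come out approximately 1/(1 − 0.3) ≈ 1.4286, i.e., X | {Y = 0.3} ~ Unif[0, 0.7].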

Example. Let Λ ~ Unif[0, 1] and let the conditional pdf of X given {Λ = λ} be

    f_{X|Λ}(x|λ) = λe^{−λx},  x ≥ 0,  0 < λ ≤ 1,

i.e., X | {Λ = λ} ~ Exp(λ). Then, by the Bayes rule,

    f_{Λ|X}(λ|3) = f_{X|Λ}(3|λ) f_Λ(λ) / ∫₀¹ f_Λ(u) f_{X|Λ}(3|u) du
                 = λe^{−3λ} / [(1 − 4e^{−3})/9]  for 0 < λ ≤ 1,
                   0  otherwise.

[Figure: the flat prior f_Λ(λ) on (0, 1] and the posterior f_{Λ|X}(λ|3).]

4.4 PAIRS OF MIXED RANDOM VARIABLES

Let X be a discrete random variable with pmf p_X(x). For each x with p_X(x) ≠ 0, conditioned on the event {X = x}, let Y be a continuous random variable with conditional cdf F_{Y|X}(y|x) and conditional pdf

    f_{Y|X}(y|x) = ∂F_{Y|X}(y|x)/∂y,

provided that the derivative is well-defined. Then, by the law of total probability,

    F_Y(y) = P{Y ≤ y} = Σ_x P{Y ≤ y | X = x} p_X(x)
           = Σ_x F_{Y|X}(y|x) p_X(x)
           = Σ_x ( ∫_{−∞}^{y} f_{Y|X}(v|x) dv ) p_X(x),

and hence

    f_Y(y) = (d/dy) F_Y(y) = Σ_x f_{Y|X}(y|x) p_X(x).

The conditional pmf of X given Y = y can be defined as a limit:

    p_{X|Y}(x|y) = lim_{Δ→0} P{X = x, y < Y ≤ y + Δ} / P{y < Y ≤ y + Δ}
                 = lim_{Δ→0} p_X(x) P{y < Y ≤ y + Δ | X = x} / P{y < Y ≤ y + Δ}
                 = lim_{Δ→0} p_X(x) f_{Y|X}(y|x) Δ / (f_Y(y) Δ)
                 = f_{Y|X}(y|x) p_X(x) / f_Y(y).

Chain rule.

    p_X(x) f_{Y|X}(y|x) = f_Y(y) p_{X|Y}(x|y).

Bayes rule. Given p_X(x) and f_{Y|X}(y|x),

    p_{X|Y}(x|y) = p_X(x) f_{Y|X}(y|x) / Σ_u p_X(u) f_{Y|X}(y|u).

Example (Additive Gaussian noise channel). Consider the communication channel shown below. The signal transmitted is a binary random variable

    X = +√P  w.p. p,
        −√P  w.p. 1 − p.

The received signal, also called the observation, is Y = X + Z, where Z ~ N(0, N) is additive noise, independent of X.

[Figure: additive Gaussian noise channel. Z ~ N(0, N) is added to the input X to produce the output Y.]

First note that Y | {X = x} ~ N(x, N).

To see this, consider

    P{Y ≤ y | X = x} = P{X + Z ≤ y | X = x} = P{Z ≤ y − x | X = x} = P{Z ≤ y − x},

which, by taking the derivative w.r.t. y, implies that

    f_{Y|X}(y|x) = f_Z(y − x) = (1/√(2πN)) e^{−(y−x)²/(2N)}.

In other words,

    Y | {X = +√P} ~ N(+√P, N)  and  Y | {X = −√P} ~ N(−√P, N).

Next, by the law of total probability,

    f_Y(y) = p_X(+√P) f_{Y|X}(y | +√P) + p_X(−√P) f_{Y|X}(y | −√P)
           = p (1/√(2πN)) e^{−(y−√P)²/(2N)} + (1 − p)(1/√(2πN)) e^{−(y+√P)²/(2N)}.

Finally, by the Bayes rule,

    p_{X|Y}(+√P | y) = p e^{−(y−√P)²/(2N)} / [ p e^{−(y−√P)²/(2N)} + (1 − p) e^{−(y+√P)²/(2N)} ]

and

    p_{X|Y}(−√P | y) = (1 − p) e^{−(y+√P)²/(2N)} / [ p e^{−(y−√P)²/(2N)} + (1 − p) e^{−(y+√P)²/(2N)} ].
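These posterior formulas are straightforward to evaluate numerically. The following sketch is not part of the notes; `agn_posterior` is an illustrative name:

```python
import math

def agn_posterior(y, p, P, N):
    """p_{X|Y}(+sqrt(P) | y) for X = +sqrt(P) w.p. p, -sqrt(P) w.p. 1-p, Y = X + Z, Z ~ N(0, N)."""
    def phi(v, m):   # N(m, N) density at v
        return math.exp(-(v - m) ** 2 / (2.0 * N)) / math.sqrt(2.0 * math.pi * N)
    num = p * phi(y, math.sqrt(P))
    return num / (num + (1.0 - p) * phi(y, -math.sqrt(P)))

# With equally likely signals, the observation y = 0 carries no information:
assert abs(agn_posterior(0.0, 0.5, 1.0, 1.0) - 0.5) < 1e-12
# A strongly positive observation makes +sqrt(P) very likely:
assert agn_posterior(3.0, 0.5, 1.0, 1.0) > 0.95
```

Note that the common factor 1/√(2πN) cancels in the ratio, exactly as in the closed form above.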

[Figure: the general digital communication system. The signal X ∈ {x₁, x₂, ..., x_n} passes through a noisy channel f_{Y|X}(y|x); the detector d(y) produces the estimate X̂ ∈ {x₁, x₂, ..., x_n}.]

Suppose that there is no observation and we would like to find the best guess d of X, that is,

    d = argmin_d P{d ≠ X}.

Clearly, the best guess is the one with the largest probability, that is,

    d = argmax_x p_X(x).    (4.1)

This simple observation continues to hold with the observation Y = y. Define the maximum a posteriori probability (MAP) detector as

    d*(y) = argmax_x p_{X|Y}(x|y).    (4.2)

Then, by the same argument as in (4.1),

    P{d*(y) = X | Y = y} ≥ P{d(y) = X | Y = y}

for any detector d(y). Consequently,

    P{d*(Y) = X} = ∫ P{d*(y) = X | Y = y} f_Y(y) dy ≥ ∫ P{d(y) = X | Y = y} f_Y(y) dy = P{d(Y) = X},

and the MAP detector minimizes the probability of error P_e.

When X is uniform on {x₁, x₂, ..., x_n}, that is, p_X(x₁) = p_X(x₂) = ⋯ = p_X(x_n) = 1/n,

    p_{X|Y}(x|y) = p_X(x) f_{Y|X}(y|x) / f_Y(y) = f_{Y|X}(y|x) / (n f_Y(y)),

which depends on x only through the likelihood f_{Y|X}(y|x) for a given y. Then the MAP detection rule in (4.2) simplifies to the maximum likelihood (ML) detection rule

    d*(y) = argmax_x f_{Y|X}(y|x).
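The MAP rule is a one-liner once p_X and the likelihood are in hand. Below is a minimal sketch (not from the notes; `map_detector` is an illustrative name). It maximizes the unnormalized score p_X(x)·f_{Y|X}(y|x), which suffices because the denominator f_Y(y) does not depend on x:

```python
def map_detector(p_X, lik):
    """Return d*(y) = argmax_x p_X[x] * lik(y, x); dropping f_Y(y) does not change the argmax."""
    return lambda y: max(p_X, key=lambda x: p_X[x] * lik(y, x))

# Sanity check on a binary symmetric channel with crossover eps < 1/2:
eps = 0.1
lik = lambda y, x: 1.0 - eps if y == x else eps
d_uniform = map_detector({0: 0.5, 1: 0.5}, lik)   # uniform prior: MAP = ML
d_biased = map_detector({0: 0.99, 1: 0.01}, lik)  # strong prior on 0
```

Here `d_uniform(1)` returns 1 (ML just repeats the received bit), while `d_biased(1)` returns 0, since 0.99 · 0.1 > 0.01 · 0.9: a strong enough prior overrides the observation.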

Example. We revisit the additive Gaussian noise channel example of Section 4.4 with signal

    X = +√P  w.p. 1/2,
        −√P  w.p. 1/2,

independent noise Z ~ N(0, N), and output Y = X + Z. The MAP detector is

    d*(y) = +√P  if p_{X|Y}(+√P | y) > p_{X|Y}(−√P | y),
            −√P  otherwise.

Since the two signals are equally likely, the MAP detection rule reduces to the ML detection rule

    d*(y) = +√P  if f_{Y|X}(y | +√P) > f_{Y|X}(y | −√P),
            −√P  otherwise.

[Figure: the two conditional densities f(y | −√P) and f(y | +√P), Gaussian bells centered at −√P and +√P, crossing at y = 0.]

From the figure above and using the Gaussian pdf, the MAP/ML detector reduces to the minimum distance detector

    d*(y) = +√P  if (y − √P)² < (y + √P)²,
            −√P  otherwise,

which further simplifies to

    d*(y) = +√P  if y > 0,
            −√P  if y ≤ 0.

Now the minimum probability of error is

    P_e = P{d*(Y) ≠ X}
        = P{X = +√P} P{d*(Y) = −√P | X = +√P} + P{X = −√P} P{d*(Y) = +√P | X = −√P}
        = (1/2) P{Y ≤ 0 | X = +√P} + (1/2) P{Y > 0 | X = −√P}
        = (1/2) P{Z ≤ −√P} + (1/2) P{Z > √P}
        = Q(√(P/N)).

Note that the probability of error is a decreasing function of P/N, which is often called the signal-to-noise ratio (SNR).
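A quick Monte Carlo experiment agrees with P_e = Q(√(P/N)). This is an illustrative check, not part of the notes; the values P = N = 1, the seed, and the trial count are arbitrary:

```python
import math
import random

def Q(x):
    """Gaussian tail probability Q(x) = P{N(0,1) > x}."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

P, N = 1.0, 1.0
random.seed(0)
trials, errors = 200_000, 0
for _ in range(trials):
    x = math.sqrt(P) if random.random() < 0.5 else -math.sqrt(P)
    y = x + random.gauss(0.0, math.sqrt(N))            # Y = X + Z, Z ~ N(0, N)
    x_hat = math.sqrt(P) if y > 0 else -math.sqrt(P)   # minimum-distance rule
    errors += (x_hat != x)

pe_mc = errors / trials
pe_theory = Q(math.sqrt(P / N))   # about 0.159 for P/N = 1
```

With 200,000 trials the Monte Carlo estimate typically lands within a few thousandths of the theoretical value; increasing P/N drives both toward zero.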

4.6 FUNCTIONS OF TWO RANDOM VARIABLES

Let (X, Y) ~ f(x, y) and let g(x, y) be a differentiable function. To find the pdf of Z = g(X, Y), we first find the inverse image of {Z ≤ z} to compute its probability expressed as a function of z, i.e., F_Z(z), and then take the derivative.

Example. Suppose that X ~ f_X(x) and Y ~ f_Y(y) are independent. Let Z = X + Y. Then

    F_Z(z) = P{Z ≤ z} = P{X + Y ≤ z}
           = ∫ P{X + Y ≤ z | X = x} f_X(x) dx
           = ∫ P{x + Y ≤ z | X = x} f_X(x) dx
           = ∫ P{Y ≤ z − x} f_X(x) dx
           = ∫ F_Y(z − x) f_X(x) dx.

Taking the derivative w.r.t. z, we have

    f_Z(z) = ∫ f_Y(z − x) f_X(x) dx,

which is the convolution of f_X(x) and f_Y(y). For example, if X ~ N(μ_X, σ²_X) and Y ~ N(μ_Y, σ²_Y) are independent, then it can be readily checked that

    Z = X + Y ~ N(μ_X + μ_Y, σ²_X + σ²_Y).

A similar result also holds for the sum of two independent discrete random variables (replacing pdfs with pmfs and integrals with sums). For example, if X ~ Poisson(λ₁) and Y ~ Poisson(λ₂) are independent, then

    Z = X + Y ~ Poisson(λ₁) * Poisson(λ₂) = Poisson(λ₁ + λ₂).

The property that the sum of two independent random variables from the same family stays in the same family, which is obeyed by Gaussian and Poisson random variables, is referred to as infinite divisibility. For example, a Poisson(λ) r.v. can be written as the sum of any number of independent Poisson(λᵢ) r.v.s, as long as Σᵢ λᵢ = λ.

It is sometimes easier to work with cdfs first to find the pdf of a function of X and Y (especially when g(x, y) is not differentiable).

Example (Minimum and maximum). Let X ~ f_X(x) and Y ~ f_Y(y) be independent. Define

    U = max{X, Y}  and  V = min{X, Y}.
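The discrete convolution result can be checked directly: convolving two truncated Poisson pmfs reproduces the Poisson(λ₁ + λ₂) pmf. This is a sketch, not part of the notes, with arbitrary choices λ₁ = 2, λ₂ = 3:

```python
import math

def poisson_pmf(lam, kmax):
    """Poisson(lam) pmf on {0, 1, ..., kmax}."""
    return [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(kmax + 1)]

def conv(p, q):
    """pmf of the sum of two independent nonnegative integer r.v.s (discrete convolution)."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

lam1, lam2, kmax = 2.0, 3.0, 40
p_sum = conv(poisson_pmf(lam1, kmax), poisson_pmf(lam2, kmax))
target = poisson_pmf(lam1 + lam2, kmax)
# Entry k <= kmax of the convolution involves no truncated terms, so it should
# match the Poisson(lam1 + lam2) pmf up to floating-point rounding.
err = max(abs(a - b) for a, b in zip(p_sum, target))
```

The truncation at kmax only affects entries beyond kmax, which is why the comparison over the first kmax + 1 entries is essentially exact.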

To find the pdf of U, we first find its cdf:

    F_U(u) = P{U ≤ u} = P{X ≤ u, Y ≤ u} = F_X(u) F_Y(u).

Using the product rule for derivatives,

    f_U(u) = f_X(u) F_Y(u) + f_Y(u) F_X(u).

Now to find the pdf of V, consider

    P{V > v} = P{X > v, Y > v} = (1 − F_X(v))(1 − F_Y(v)),

so that

    F_V(v) = 1 − (1 − F_X(v))(1 − F_Y(v)).

Thus

    f_V(v) = f_X(v) + f_Y(v) − f_X(v) F_Y(v) − f_Y(v) F_X(v).

More generally, the joint pdf of (U, V) can be found by similar arguments; see the problems below.

PROBLEMS

1. Geometric with conditions. Let X be a geometric random variable with pmf

    p_X(k) = p(1 − p)^{k−1},  k = 1, 2, ....

Find and plot the conditional pmf p_X(k | A) = P{X = k | X ∈ A} if:
(a) A = {X > m}, where m is a positive integer.
(b) A = {X < m}.
(c) A = {X is an even number}.
Comment on the shape of the conditional pmf of part (a).

2. Conditional cdf. Let A be an event with nonzero probability. Show that
(a) P(A) = P(A | X ≤ x) F_X(x) + P(A | X > x)(1 − F_X(x)).
(b) F_X(x | A) = P(A | X ≤ x) F_X(x) / P(A).

3. Joint cdf or not. Consider the function

    G(x, y) = 1  if x + y ≥ 0,
              0  otherwise.

Can G be a joint cdf for a pair of random variables? Justify your answer.

4. Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0, t]. Suppose N(t) is Poisson with pmf

    p_{N(t)}(n) = (λt)ⁿ e^{−λt} / n!  for n = 0, 1, 2, ....

Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

5. Diamond distribution. Consider the random variables X and Y with the joint pdf

    f_{X,Y}(x, y) = c  if |x| + |y| ≤ 1/√2,
                    0  otherwise,

where c is a constant.
(a) Find c.
(b) Find f_X(x) and f_{X|Y}(x|y).
(c) Are X and Y independent random variables? Justify your answer.
(d) Define the random variable Z = |X| + |Y|. Find the pdf f_Z(z).

6. Coin with random bias. You are given a coin but are not told what its bias (probability of heads) is. You are told instead that the bias is the outcome of a random variable P ~ U[0, 1]. To get more information about the coin bias, you flip it independently 10 times. Let X be the number of heads you get. Thus X ~ Binom(10, P). Assuming that X = 9, find and sketch the a posteriori probability of P, i.e., f_{P|X}(p|9).

7. First available teller. Consider a bank with two tellers. The service times for the tellers are independent exponentially distributed random variables X₁ ~ Exp(λ₁) and X₂ ~ Exp(λ₂), respectively. You arrive at the bank and find that both tellers are busy but that nobody else is waiting to be served. You are served by the first available teller once he/she is free.
(a) What is the probability that you are served by the first teller?
(b) Let the random variable Y denote your waiting time. Find the pdf of Y.

8. Optical communication channel. Let the signal input to an optical channel be given by

    X = 1  with probability 1/2,
        0  with probability 1/2.

The conditional pmf of the output of the channel is Y | {X = 1} ~ Poisson(1), i.e., Poisson with intensity λ = 1, and Y | {X = 0} ~ Poisson(10).
(a) Show that the MAP rule reduces to a threshold rule

    D(y) = 1  if y < y*,
           0  otherwise.

(b) Find y* and the corresponding probability of error.

9. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identically looking bottles. One bottle contains a chemical named Iocane and the other bottle contains a chemical named Sennari. It is well known that the radioactivity level of Iocane has the U[0, 1] distribution, while the radioactivity level of Sennari has the Exp(1) distribution.

(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal decision rule (based on the measurement X) that maximizes the chance of correctly identifying the content of the bottle?
(b) What is the associated probability of error?

10. Independence. Let X ∈ X and Y ∈ Y be two independent discrete random variables.
(a) Show that any two events {X ∈ A} and {Y ∈ B}, where A ⊆ X and B ⊆ Y, are independent.
(b) Show that any two functions of X and Y separately are independent; that is, if U = g(X) and V = h(Y), then U and V are independent.

11. Family planning. Alice and Bob choose a number X at random from the set {2, 3, 4} (so the outcomes are equally probable). If the outcome is X = x, they decide to have children until they have a girl or x children, whichever comes first. Assume that each child is a girl with probability 1/2 (independent of the number of children and the gender of the other children). Let Y be the number of children they will have.
(a) Find the conditional pmf p_{Y|X}(y|x) for all possible values of x and y.
(b) Find the pmf of Y.

12. Radar signal detection. The signal for a radar channel is S = 0 if there is no target and a random variable X ~ N(0, P) if there is a target. Both occur with equal probability. Thus

    S = 0            with probability 1/2,
        X ~ N(0, P)  with probability 1/2.

The radar receiver observes Y = S + Z, where the noise Z ~ N(0, N) is independent of S. Find the optimal decoder for deciding whether S = 0 or S = X, and its probability of error. Provide your answer in terms of intervals of y, and give the boundary points of the intervals in terms of P and N.

13. Ternary signaling. Let the signal S be a random variable defined as follows:

    S = −1  with probability 1/3,
         0  with probability 1/3,
        +1  with probability 1/3.

The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian random variable with pdf

    f_Z(z) = (λ/2) e^{−λ|z|},  −∞ < z < ∞.

The signal S and the noise Z are assumed to be independent, and the channel output is their sum Y = S + Z.

(a) Find f_{Y|S}(y|s) for s = −1, 0, +1. Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(y) for deciding whether S is −1, 0, or +1. Give your answer in terms of ranges of values of y.
(c) Find the probability of decoding error for D(y) in terms of λ.

14. Signal or no signal. Consider a communication system that is operated only from time to time. When the communication system is in the normal mode (denoted by M = 1), it transmits a random signal S = X with

    X = +1  with probability 1/2,
        −1  with probability 1/2.

When the system is in the idle mode (denoted by M = 0), it does not transmit any signal (S = 0). Both normal and idle modes occur with equal probability. Thus

    S = X  with probability 1/2,
        0  with probability 1/2.

The receiver observes Y = S + Z, where the ambient noise Z ~ U[−1, 1] is independent of S.
(a) Find and sketch the conditional pdf f_{Y|M}(y|1) of the receiver observation Y given that the system is in the normal mode.
(b) Find and sketch the conditional pdf f_{Y|M}(y|0) of the receiver observation Y given that the system is in the idle mode.
(c) Find the optimal decoder d*(y) for deciding whether the system is normal or idle. Provide the answer in terms of intervals of y.
(d) Find the associated probability of error.

15. Two independent uniform random variables. Let X and Y be independently and uniformly drawn from the interval [0, 1].
(a) Find the pdf of U = max(X, Y).
(b) Find the pdf of V = min(X, Y).
(c) Find the pdf of W = U − V.
(d) Find the pdf of Z = (X + Y) mod 1.
(e) Find the probability P{|X − Y| ≤ 1/2}.

16. Two independent Gaussian random variables. Let X and Y be independent Gaussian random variables, both with zero mean and unit variance. Find the pdf of X − Y.

17. Functions of exponential random variables. Let X and Y be independent exponentially distributed random variables with the same parameter λ. Define the following three functions of X and Y:

    U = max(X, Y),  V = min(X, Y),  W = U − V.

(a) Find the joint pdf of U and V.
(b) Find the joint pdf of V and W. Are they independent?
Hint: You can solve part (b) either directly by finding the joint cdf, or by expressing the joint pdf in terms of f_{U,V}(u, v) and using the result of part (a).

18. Maximal correlation. Let (X, Y) ~ F_{X,Y}(x, y) be a pair of random variables.
(a) Show that F_{X,Y}(x, y) ≤ min{F_X(x), F_Y(y)}.
Now let F(x) and G(y) be continuous and invertible cdfs, and let X ~ F(x).
(b) Find the cdf of Y = G^{−1}(F(X)).
(c) Show that for this pair, F_{X,Y}(x, y) = min{F(x), G(y)}.


Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional

More information

Review of Elementary Probability Lecture I Hamid R. Rabiee

Review of Elementary Probability Lecture I Hamid R. Rabiee Stochastic Processes Review o Elementar Probabilit Lecture I Hamid R. Rabiee Outline Histor/Philosoph Random Variables Densit/Distribution Functions Joint/Conditional Distributions Correlation Important

More information

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables CDA6530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Two Classes of R.V. Discrete R.V. Bernoulli Binomial Geometric Poisson Continuous R.V. Uniform Exponential,

More information

Lecture Notes 5 Convergence and Limit Theorems. Convergence with Probability 1. Convergence in Mean Square. Convergence in Probability, WLLN

Lecture Notes 5 Convergence and Limit Theorems. Convergence with Probability 1. Convergence in Mean Square. Convergence in Probability, WLLN Lecture Notes 5 Convergence and Limit Theorems Motivation Convergence with Probability Convergence in Mean Square Convergence in Probability, WLLN Convergence in Distribution, CLT EE 278: Convergence and

More information

0.24 adults 2. (c) Prove that, regardless of the possible values of and, the covariance between X and Y is equal to zero. Show all work.

0.24 adults 2. (c) Prove that, regardless of the possible values of and, the covariance between X and Y is equal to zero. Show all work. 1 A socioeconomic stud analzes two discrete random variables in a certain population of households = number of adult residents and = number of child residents It is found that their joint probabilit mass

More information

Probability review. September 11, Stoch. Systems Analysis Introduction 1

Probability review. September 11, Stoch. Systems Analysis Introduction 1 Probability review Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania aribeiro@seas.upenn.edu http://www.seas.upenn.edu/users/~aribeiro/ September 11, 2015 Stoch.

More information

Problem Solutions Chapter 4

Problem Solutions Chapter 4 Problem Solutions Chapter 4 Problem 4.. Solution (a) The probability P [, 3] can be found be evaluating the joint CDF F, (x, y) at x andy 3. This yields P [, 3] F, (, 3) ( e )( e 3 ) () (b) To find the

More information

3. Several Random Variables

3. Several Random Variables . Several Random Variables. Two Random Variables. Conditional Probabilit--Revisited. Statistical Independence.4 Correlation between Random Variables. Densit unction o the Sum o Two Random Variables. Probabilit

More information

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100]

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100] HW7 Solutions. 5 pts.) James Bond James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from from base A to base B, distance km apart. Now suppose the plane takes off from

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information

CHAPTER 3 RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. 3.1 Concept of a Random Variable. 3.2 Discrete Probability Distributions

CHAPTER 3 RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS. 3.1 Concept of a Random Variable. 3.2 Discrete Probability Distributions CHAPTER 3 RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS 3.1 Concept of a Random Variable Random Variable A random variable is a function that associates a real number with each element in the sample space.

More information

Review of Probability. CS1538: Introduction to Simulations

Review of Probability. CS1538: Introduction to Simulations Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002 EE/CpE 345 Modeling and Simulation Class 5 September 30, 2002 Statistical Models in Simulation Real World phenomena of interest Sample phenomena select distribution Probabilistic, not deterministic Model

More information

2. This problem is very easy if you remember the exponential cdf; i.e., 0, y 0 F Y (y) = = ln(0.5) = ln = ln 2 = φ 0.5 = β ln 2.

2. This problem is very easy if you remember the exponential cdf; i.e., 0, y 0 F Y (y) = = ln(0.5) = ln = ln 2 = φ 0.5 = β ln 2. FALL 8 FINAL EXAM SOLUTIONS. Use the basic counting rule. There are n = number of was to choose 5 white balls from 7 = n = number of was to choose ellow ball from 5 = ( ) 7 5 ( ) 5. there are n n = ( )(

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

Lecture 10: Broadcast Channel and Superposition Coding

Lecture 10: Broadcast Channel and Superposition Coding Lecture 10: Broadcast Channel and Superposition Coding Scribed by: Zhe Yao 1 Broadcast channel M 0M 1M P{y 1 y x} M M 01 1 M M 0 The capacity of the broadcast channel depends only on the marginal conditional

More information

MAS1302 Computational Probability and Statistics

MAS1302 Computational Probability and Statistics MAS1302 Computational Probability and Statistics April 23, 2008 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great

More information

UCSD ECE250 Handout #20 Prof. Young-Han Kim Monday, February 26, Solutions to Exercise Set #7

UCSD ECE250 Handout #20 Prof. Young-Han Kim Monday, February 26, Solutions to Exercise Set #7 UCSD ECE50 Handout #0 Prof. Young-Han Kim Monday, February 6, 07 Solutions to Exercise Set #7. Minimum waiting time. Let X,X,... be i.i.d. exponentially distributed random variables with parameter λ, i.e.,

More information

Basics of Stochastic Modeling: Part II

Basics of Stochastic Modeling: Part II Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference

More information

Section 1.2: A Catalog of Functions

Section 1.2: A Catalog of Functions Section 1.: A Catalog of Functions As we discussed in the last section, in the sciences, we often tr to find an equation which models some given phenomenon in the real world - for eample, temperature as

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random CDA6530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Definition Random variable (RV)X (R.V.) X: A function on sample space X: S R Cumulative distribution

More information

EE5319R: Problem Set 3 Assigned: 24/08/16, Due: 31/08/16

EE5319R: Problem Set 3 Assigned: 24/08/16, Due: 31/08/16 EE539R: Problem Set 3 Assigned: 24/08/6, Due: 3/08/6. Cover and Thomas: Problem 2.30 (Maimum Entropy): Solution: We are required to maimize H(P X ) over all distributions P X on the non-negative integers

More information

Random Variables and Probability Distributions

Random Variables and Probability Distributions CHAPTER Random Variables and Probability Distributions Random Variables Suppose that to each point of a sample space we assign a number. We then have a function defined on the sample space. This function

More information

EE 505 Introduction. What do we mean by random with respect to variables and signals?

EE 505 Introduction. What do we mean by random with respect to variables and signals? EE 505 Introduction What do we mean by random with respect to variables and signals? unpredictable from the perspective of the observer information bearing signal (e.g. speech) phenomena that are not under

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is Math 416 Lecture 3 Expected values The average or mean or expected value of x 1, x 2, x 3,..., x n is x 1 x 2... x n n x 1 1 n x 2 1 n... x n 1 n 1 n x i p x i where p x i 1 n is the probability of x i

More information

Assignment 9. Due: July 14, 2017 Instructor: Dr. Mustafa El-Halabi. ! A i. P (A c i ) i=1

Assignment 9. Due: July 14, 2017 Instructor: Dr. Mustafa El-Halabi. ! A i. P (A c i ) i=1 CCE 40: Communication Systems Summer 206-207 Assignment 9 Due: July 4, 207 Instructor: Dr. Mustafa El-Halabi Instructions: You are strongly encouraged to type out your solutions using mathematical mode

More information

Lecture 4: Random Variables and Distributions

Lecture 4: Random Variables and Distributions Lecture 4: Random Variables and Distributions Goals Random Variables Overview of discrete and continuous distributions important in genetics/genomics Working with distributions in R Random Variables A

More information

ECE 313: Conflict Final Exam Tuesday, May 13, 2014, 7:00 p.m. 10:00 p.m. Room 241 Everitt Lab

ECE 313: Conflict Final Exam Tuesday, May 13, 2014, 7:00 p.m. 10:00 p.m. Room 241 Everitt Lab University of Illinois Spring 1 ECE 313: Conflict Final Exam Tuesday, May 13, 1, 7: p.m. 1: p.m. Room 1 Everitt Lab 1. [18 points] Consider an experiment in which a fair coin is repeatedly tossed every

More information

The exponential distribution and the Poisson process

The exponential distribution and the Poisson process The exponential distribution and the Poisson process 1-1 Exponential Distribution: Basic Facts PDF f(t) = { λe λt, t 0 0, t < 0 CDF Pr{T t) = 0 t λe λu du = 1 e λt (t 0) Mean E[T] = 1 λ Variance Var[T]

More information

Chapter 2 Random Variables

Chapter 2 Random Variables Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Random Vectors. 1 Joint distribution of a random vector. 1 Joint distribution of a random vector

Random Vectors. 1 Joint distribution of a random vector. 1 Joint distribution of a random vector Random Vectors Joint distribution of a random vector Joint distributionof of a random vector Marginal and conditional distributions Previousl, we studied probabilit distributions of a random variable.

More information

Probability. Lecture Notes. Adolfo J. Rumbos

Probability. Lecture Notes. Adolfo J. Rumbos Probability Lecture Notes Adolfo J. Rumbos October 20, 204 2 Contents Introduction 5. An example from statistical inference................ 5 2 Probability Spaces 9 2. Sample Spaces and σ fields.....................

More information

Notes for Math 324, Part 20

Notes for Math 324, Part 20 7 Notes for Math 34, Part Chapter Conditional epectations, variances, etc.. Conditional probability Given two events, the conditional probability of A given B is defined by P[A B] = P[A B]. P[B] P[A B]

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Stochastic Processes. Review of Elementary Probability Lecture I. Hamid R. Rabiee Ali Jalali

Stochastic Processes. Review of Elementary Probability Lecture I. Hamid R. Rabiee Ali Jalali Stochastic Processes Review o Elementary Probability bili Lecture I Hamid R. Rabiee Ali Jalali Outline History/Philosophy Random Variables Density/Distribution Functions Joint/Conditional Distributions

More information

Special distributions

Special distributions Special distributions August 22, 2017 STAT 101 Class 4 Slide 1 Outline of Topics 1 Motivation 2 Bernoulli and binomial 3 Poisson 4 Uniform 5 Exponential 6 Normal STAT 101 Class 4 Slide 2 What distributions

More information

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables:

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables: CHAPTER 5 Jointl Distributed Random Variable There are some situations that experiment contains more than one variable and researcher interested in to stud joint behavior of several variables at the same

More information

Expected Values, Exponential and Gamma Distributions

Expected Values, Exponential and Gamma Distributions Expected Values, Exponential and Gamma Distributions Sections 5.2 & 5.4 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 13-3339 Cathy

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 29, 2014 Introduction Introduction The world of the model-builder

More information

INF Introduction to classifiction Anne Solberg Based on Chapter 2 ( ) in Duda and Hart: Pattern Classification

INF Introduction to classifiction Anne Solberg Based on Chapter 2 ( ) in Duda and Hart: Pattern Classification INF 4300 151014 Introduction to classifiction Anne Solberg anne@ifiuiono Based on Chapter 1-6 in Duda and Hart: Pattern Classification 151014 INF 4300 1 Introduction to classification One of the most challenging

More information

MA 519 Probability: Review

MA 519 Probability: Review MA 519 : Review Yingwei Wang Department of Mathematics, Purdue University, West Lafayette, IN, USA Contents 1 How to compute the expectation? 1.1 Tail........................................... 1. Index..........................................

More information

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014 Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists

More information

Exercise 1. = P(y a 1)P(a 1 )

Exercise 1. = P(y a 1)P(a 1 ) Chapter 7 Channel Capacity Exercise 1 A source produces independent, equally probable symbols from an alphabet {a 1, a 2 } at a rate of one symbol every 3 seconds. These symbols are transmitted over a

More information

Exam 1. Problem 1: True or false

Exam 1. Problem 1: True or false Exam 1 Problem 1: True or false We are told that events A and B are conditionally independent, given a third event C, and that P(B C) > 0. For each one of the following statements, decide whether the statement

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

Lecture 13: Conditional Distributions and Joint Continuity Conditional Probability for Discrete Random Variables

Lecture 13: Conditional Distributions and Joint Continuity Conditional Probability for Discrete Random Variables EE5110: Probability Foundations for Electrical Engineers July-November 015 Lecture 13: Conditional Distributions and Joint Continuity Lecturer: Dr. Krishna Jagannathan Scribe: Subrahmanya Swamy P 13.1

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

Transform Techniques - CF

Transform Techniques - CF Transform Techniques - CF [eview] Moment Generating Function For a real t, the MGF of the random variable is t t M () t E[ e ] e Characteristic Function (CF) k t k For a real ω, the characteristic function

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

1: PROBABILITY REVIEW

1: PROBABILITY REVIEW 1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Review of Probability

Review of Probability Review of robabilit robabilit Theor: Man techniques in speech processing require the manipulation of probabilities and statistics. The two principal application areas we will encounter are: Statistical

More information

Probability Review. Chao Lan

Probability Review. Chao Lan Probability Review Chao Lan Let s start with a single random variable Random Experiment A random experiment has three elements 1. sample space Ω: set of all possible outcomes e.g.,ω={1,2,3,4,5,6} 2. event

More information

EXAM # 3 PLEASE SHOW ALL WORK!

EXAM # 3 PLEASE SHOW ALL WORK! Stat 311, Summer 2018 Name EXAM # 3 PLEASE SHOW ALL WORK! Problem Points Grade 1 30 2 20 3 20 4 30 Total 100 1. A socioeconomic study analyzes two discrete random variables in a certain population of households

More information

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay 1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued

More information

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities

More information

2 Statistical Estimation: Basic Concepts

2 Statistical Estimation: Basic Concepts Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof. N. Shimkin 2 Statistical Estimation:

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

Chapter 1. Sets and probability. 1.3 Probability space

Chapter 1. Sets and probability. 1.3 Probability space Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability

More information

9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences Emery N. Brown

9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences Emery N. Brown 9.07 Introduction to Probabilit and Statistics for Brain and Cognitive Sciences Emer N. Brown I. Objectives Lecture 4: Transformations of Random Variables, Joint Distributions of Random Variables A. Understand

More information
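The joint, marginal, and conditional pmf relations above can be illustrated in code. This is a minimal sketch, not part of the lecture: the joint pmf values below are hypothetical, chosen only so the total probability is 1.

```python
# Sketch: marginal and conditional pmfs from a joint pmf p_{X,Y}(x, y).
# The joint pmf values are hypothetical, for illustration only.
from collections import defaultdict

# Joint pmf stored as {(x, y): probability}.
p_xy = {
    (0, 0): 0.2, (0, 1): 0.1,
    (1, 0): 0.3, (1, 1): 0.4,
}
assert abs(sum(p_xy.values()) - 1.0) < 1e-12  # axioms: pmf sums to 1

# Marginal pmfs via the law of total probability:
#   p_X(x) = sum_y p_{X,Y}(x, y),  p_Y(y) = sum_x p_{X,Y}(x, y).
p_x = defaultdict(float)
p_y = defaultdict(float)
for (x, y), p in p_xy.items():
    p_x[x] += p
    p_y[y] += p

# Conditional pmf of X given Y = y:
#   p_{X|Y}(x | y) = p_{X,Y}(x, y) / p_Y(y), defined when p_Y(y) > 0.
def cond_pmf_x_given_y(y):
    if p_y[y] == 0:
        raise ValueError("conditioning on a zero-probability event")
    return {x: p_xy.get((x, y), 0.0) / p_y[y] for x in p_x}

print(dict(p_x))              # marginal pmf of X
print(cond_pmf_x_given_y(1))  # conditional pmf of X given Y = 1
```

For any y with p_Y(y) > 0, the returned conditional pmf is itself a valid pmf for X: its values are nonnegative and sum to 1, as the lecture asks the reader to check.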