A Course Material on Probability and Random Processes


A Course Material on Probability and Random Processes By Mrs. V. Sumathi, ASSISTANT PROFESSOR, DEPARTMENT OF SCIENCE AND HUMANITIES, SASURIE COLLEGE OF ENGINEERING, VIJAYAMANGALAM

QUALITY CERTIFICATE
This is to certify that the e-course material
Subject Code: MA645
Subject: Probability and Random Processes
Class: II Year ECE
being prepared by me meets the knowledge requirement of the university curriculum.
Name: V. Sumathi
Designation: AP
Signature of the Author
This is to certify that the course material being prepared by Mrs. V. Sumathi is of adequate quality. She has referred to more than five books, among them at least one by an author from abroad.
Signature of HOD
Name: Mr. N. Ramkumar
SEAL

CONTENTS

UNIT I RANDOM VARIABLES
Introduction - Discrete Random Variables - Continuous Random Variables - Moments - Moment Generating Functions - Binomial Distribution - Poisson Distribution - Geometric Distribution - Uniform Distribution - Exponential Distribution - Gamma Distribution

UNIT II TWO DIMENSIONAL RANDOM VARIABLES
Introduction - Joint Distribution - Marginal and Conditional Distribution - Covariance - Correlation Coefficient - Problems - Linear Regression - Transformation of Random Variables - Problems

UNIT III RANDOM PROCESSES
Introduction - Classification - Stationary Processes - Markov Processes - Poisson Processes - Random Telegraph Processes

UNIT IV CORRELATION AND SPECTRAL DENSITIES
Introduction - Auto Correlation Functions - Properties - Cross Correlation Functions - Properties - Power Spectral Density - Properties - Cross Spectral Density - Properties

UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS
Introduction - Linear Time Invariant Systems - Problems - Linear Systems with Random Inputs - Auto Correlation and Cross Correlation Functions of Inputs and Outputs - System Transfer Function - Problems

MA645 PROBABILITY AND RANDOM PROCESSES L T P C 3 1 0 4
OBJECTIVES: To provide the necessary basic concepts in probability and random processes for applications such as random signals and linear systems in communication engineering.
UNIT I RANDOM VARIABLES 9+3
Discrete and continuous random variables - Moments - Moment generating functions - Binomial, Poisson, Geometric, Uniform, Exponential, Gamma and Normal distributions.
UNIT II TWO-DIMENSIONAL RANDOM VARIABLES 9+3
Joint distributions - Marginal and conditional distributions - Covariance - Correlation and linear regression - Transformation of random variables.
UNIT III RANDOM PROCESSES 9+3
Classification - Stationary process - Markov process - Poisson process - Random telegraph process.
UNIT IV CORRELATION AND SPECTRAL DENSITIES 9+3
Auto correlation functions - Cross correlation functions - Properties - Power spectral density - Cross spectral density - Properties.
UNIT V LINEAR SYSTEMS WITH RANDOM INPUTS 9+3
Linear time invariant system - System transfer function - Linear systems with random inputs - Auto correlation and cross correlation functions of input and output.
TOTAL (L: 45 + T: 15): 60 PERIODS
OUTCOMES: The students will have an exposure to various distribution functions and will acquire skills in handling situations involving more than one random variable. They will be able to analyze the response of linear time invariant systems to random inputs.
TEXT BOOKS:
1. Ibe, O.C., "Fundamentals of Applied Probability and Random Processes", Elsevier, 1st Indian Reprint, 2007.
2. Peebles, P.Z., "Probability, Random Variables and Random Signal Principles", Tata McGraw Hill, 4th Edition, New Delhi.
REFERENCES:
1. Yates, R.D. and Goodman, D.J., "Probability and Stochastic Processes", 2nd Edition, Wiley India Pvt. Ltd., Bangalore.
2. Stark, H. and Woods, J.W., "Probability and Random Processes with Applications to Signal Processing", 3rd Edition, Pearson Education, Asia.
3. Miller, S.L. and Childers, D.G., "Probability and Random Processes with Applications to Signal Processing and Communications", Academic Press.
4. Hwei Hsu, "Schaum's Outline of Theory and Problems of Probability, Random Variables and Random Processes", Tata McGraw Hill Edition, New Delhi.
5. Cooper, G.R. and McGillem, C.D., "Probabilistic Methods of Signal and System Analysis", 3rd Indian Edition, Oxford University Press, New Delhi.

UNIT - I RANDOM VARIABLES
Introduction
Consider an experiment of throwing a coin twice. The outcomes {HH, HT, TH, TT} constitute the sample space. Each of these outcomes can be associated with a number by specifying a rule of association (e.g. the number of heads). Such a rule of association is called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the random variable by x or y. Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.
1.1 SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes S = {HH, HT, TH, TT} constitute the sample space.
1.2 RANDOM VARIABLE
In this sample space each of these outcomes can be associated with a number by specifying a rule of association. Such a rule of association is called a random variable. E.g.: the number of heads. We denote a random variable by a letter (X, Y, etc.) and any particular value of the random variable by x or y.
S = {HH, HT, TH, TT}, X(S) = {2, 1, 1, 0}
Thus a random variable X can be considered as a function that maps all elements of the sample space S into points on the real line. The notation X(s) = x means that x is the value associated with the outcome s by the random variable X.
Example
In the experiment of throwing a coin twice the sample space S is S = {HH, HT, TH, TT}. Let X be the random variable defined by X(s) = x, the number of heads in s.
Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random variable.
1.2.1 DISCRETE RANDOM VARIABLE
Definition: A discrete random variable is a random variable X whose possible values constitute a finite set of values or a countably infinite set of values.
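As a quick illustration (a minimal Python sketch added here, not part of the original notes), the coin-tossing random variable above is simply an explicit mapping from outcomes to numbers:

```python
from collections import Counter

# The random variable "number of heads" for two tosses, X: S -> R.
sample_space = ["HH", "HT", "TH", "TT"]
X = {s: s.count("H") for s in sample_space}   # X(HH)=2, X(HT)=X(TH)=1, X(TT)=0

# Each outcome is equally likely, so P(X = x) is found by counting outcomes.
pmf = {x: n / len(sample_space) for x, n in Counter(X.values()).items()}
print(X)    # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
print(pmf)  # {2: 0.25, 1: 0.5, 0: 0.25}
```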

All the random variables in the above example are discrete random variables.
Remark
The meaning of P(X ≤ a): P(X ≤ a) is simply the probability of the set of outcomes s in the sample space for which X(s) ≤ a, i.e.
P(X ≤ a) = P{s : X(s) ≤ a}
In the above example, P(X ≤ 1) = P{HT, TH, TT} = 3/4. Here P(X ≤ 1) = 3/4 means that the probability that the random variable X (the number of heads) is less than or equal to 1 is 3/4.
Distribution function of the random variable X (cumulative distribution function)
Def: The distribution function of a random variable X, defined on (-∞, ∞), is given by
F(x) = P(X ≤ x) = P{s : X(s) ≤ x}
Note
Let the random variable X take the values x1, x2, ..., xn with probabilities p1, p2, ..., pn, and let x1 < x2 < ... < xn. Then
F(x) = 0 for -∞ < x < x1
F(x) = p1 for x1 ≤ x < x2
F(x) = p1 + p2 for x2 ≤ x < x3
...
F(x) = p1 + p2 + ... + pn = 1 for x ≥ xn
1.2.2 PROPERTIES OF DISTRIBUTION FUNCTIONS
Property 1: P(a < X ≤ b) = F(b) - F(a), where F(x) = P(X ≤ x)
Property 2: P(a ≤ X ≤ b) = P(X = a) + F(b) - F(a)
Property 3: P(a < X < b) = P(a < X ≤ b) - P(X = b) = F(b) - F(a) - P(X = b), by Property 1.
1.2.3 PROBABILITY MASS FUNCTION (OR) PROBABILITY FUNCTION
Let X be a one dimensional discrete random variable which takes the values x1, x2, ... To each possible outcome xi we can associate a number pi, i.e. P(X = xi) = p(xi) = pi, called the probability of xi. The numbers pi = p(xi) satisfy the following conditions:
(i) p(xi) ≥ 0 for all i, (ii) Σi p(xi) = 1
The function p(x) satisfying the above two conditions is called the probability mass function (or) probability distribution of the random variable X. The probability distribution {xi, pi} can be displayed in the form of a table:
X = xi : x1, x2, ..., xi, ...
P(X = xi) : p1, p2, ..., pi, ...
Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by writing X = x.
P(X = x) = P{s : X(s) = x}
Similarly P(X ≤ a) = P{s : X(s) ∈ (-∞, a]} and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)}, and so on.
Theorem 1: If X1 and X2 are random variables and K is a constant, then KX1, X1 + X2, X1X2, K1X1 + K2X2, X1 - X2 are also random variables.
Theorem 2: If X is a random variable and f(.) is a continuous function, then f(X) is a random variable.
Note
If F(x) is the distribution function of a one dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(-∞) = lim (x→-∞) F(x) = 0
IV. F(∞) = lim (x→∞) F(x) = 1
V. If X is a discrete random variable taking values x1, x2, x3, ..., where x1 < x2 < ... < xi-1 < xi < ..., then P(X = xi) = F(xi) - F(xi-1)
Example 1.1
A random variable X has the following probability function:
Values of X : 0 1 2 3 4 5 6 7 8
Probability P(X) : a 3a 5a 7a 9a 11a 13a 15a 17a
(i) Determine the value of a.
(ii) Find P(X < 3), P(X ≥ 3), P(0 < X < 5).
(iii) Find the distribution function of X.
Solution
Table 1
X : 0 1 2 3 4 5 6 7 8
p(x) : a 3a 5a 7a 9a 11a 13a 15a 17a
(i) We know that if p(x) is a probability mass function then Σi p(xi) = 1:
p(0) + p(1) + p(2) + p(3) + p(4) + p(5) + p(6) + p(7) + p(8) = 1
a + 3a + 5a + 7a + 9a + 11a + 13a + 15a + 17a = 1
81a = 1, so a = 1/81
Putting a = 1/81 in Table 1, we get Table 2:
X : 0 1 2 3 4 5 6 7 8
P(x) : 1/81 3/81 5/81 7/81 9/81 11/81 13/81 15/81 17/81
(ii) P(X < 3) = p(0) + p(1) + p(2) = 1/81 + 3/81 + 5/81 = 9/81
P(X ≥ 3) = 1 - P(X < 3) = 1 - 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4) (here 0 and 5 are not included) = 3/81 + 5/81 + 7/81 + 9/81 = 24/81
(iii) To find the distribution function of X, using Table 2, we get
X = x : F(X) = P(X ≤ x)
0 : F(0) = p(0) = 1/81
1 : F(1) = P(X ≤ 1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2 : F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3 : F(3) = P(X ≤ 3) = p(0) + p(1) + p(2) + p(3) = 9/81 + 7/81 = 16/81
4 : F(4) = P(X ≤ 4) = p(0) + p(1) + ... + p(4) = 16/81 + 9/81 = 25/81
5 : F(5) = P(X ≤ 5) = p(0) + p(1) + ... + p(4) + p(5) = 25/81 + 11/81 = 36/81
6 : F(6) = P(X ≤ 6) = p(0) + p(1) + ... + p(6) = 36/81 + 13/81 = 49/81
7 : F(7) = P(X ≤ 7) = p(0) + p(1) + ... + p(6) + p(7) = 49/81 + 15/81 = 64/81
8 : F(8) = P(X ≤ 8) = p(0) + p(1) + ... + p(6) + p(7) + p(8) = 64/81 + 17/81 = 81/81 = 1
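The arithmetic above is easy to check mechanically. A minimal Python sketch (an addition, not part of the original notes) that recomputes a, the requested probabilities and the distribution function:

```python
from fractions import Fraction

# p(x) proportional to 1, 3, 5, ..., 17 for x = 0..8; the total is 81a = 1.
weights = [2 * x + 1 for x in range(9)]
a = Fraction(1, sum(weights))                 # a = 1/81
pmf = {x: a * w for x, w in zip(range(9), weights)}

print(a)                                      # 1/81
print(sum(pmf[x] for x in range(3)))          # P(X < 3)  = 9/81
print(1 - sum(pmf[x] for x in range(3)))      # P(X >= 3) = 72/81
print(sum(pmf[x] for x in range(1, 5)))       # P(0 < X < 5) = 24/81
cdf = {x: sum(pmf[k] for k in range(x + 1)) for x in range(9)}
print(cdf[8])                                 # F(8) = 1
```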

1.3 CONTINUOUS RANDOM VARIABLE
Def: A random variable X which takes all possible values in a given interval is called a continuous random variable.
Example: Age, height and weight are continuous random variables.
1.3.1 PROBABILITY DENSITY FUNCTION
Consider a continuous random variable X specified on a certain interval (a, b) (which can also be an infinite interval (-∞, ∞)). If there is a function y = f(x) such that
lim (Δx→0) P(x < X < x + Δx)/Δx = f(x)
then this function f(x) is termed the probability density function (or simply the density function) of the random variable X. It is also called the frequency function, distribution density or probability density function. The curve y = f(x) is called the probability curve or the distribution curve.
Remark
If f(x) is the p.d.f. of the random variable X, then the probability that a value of X will fall in some interval (a, b) is equal to the definite integral of the function f(x) from a to b:
P(a < X < b) = ∫[a to b] f(x) dx (or) P(a ≤ X ≤ b) = ∫[a to b] f(x) dx
1.3.2 PROPERTIES OF P.D.F.
The p.d.f. f(x) of a random variable X has the following properties:
(i) f(x) ≥ 0, -∞ < x < ∞
(ii) ∫[-∞ to ∞] f(x) dx = 1
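Both properties, and interval probabilities, can be checked numerically. A short sketch (added here; it uses the density 6x(1-x) that appears in a later example of these notes):

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: 6 * x * (1 - x)      # a density on (0, 1)

total, _ = quad(f, 0, 1)           # property (ii): must integrate to 1
prob, _ = quad(f, 0.2, 0.5)        # P(0.2 < X < 0.5)
print(np.isclose(total, 1.0))      # True
print(prob)                        # 0.396
```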

Remark
1. In the case of a discrete random variable the probability at a point, say at x = c, need not be zero. But in the case of a continuous random variable X the probability at a point is always zero:
P(X = c) = ∫[c to c] f(x) dx = 0
2. If X is a continuous random variable, then
P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b)
IMPORTANT DEFINITIONS IN TERMS OF P.D.F.
If f(x) is the p.d.f. of a random variable X which is defined on the interval (a, b), then
(i) Arithmetic mean = ∫[a to b] x f(x) dx
(ii) Harmonic mean H: 1/H = ∫[a to b] (1/x) f(x) dx
(iii) Geometric mean G: log G = ∫[a to b] log x f(x) dx
(iv) Moments about the origin: ∫[a to b] x^r f(x) dx
(v) Moments about any point A: ∫[a to b] (x - A)^r f(x) dx
(vi) Moment about the mean: µr = ∫[a to b] (x - mean)^r f(x) dx
(vii) Variance: µ2 = ∫[a to b] (x - mean)² f(x) dx
(viii) Mean deviation about the mean: M.D. = ∫[a to b] |x - mean| f(x) dx
1.3.3 Mathematical Expectation
Def: Let X be a continuous random variable with probability density function f(x). Then the mathematical expectation of X is denoted by E(X) and is given by
E(X) = ∫[-∞ to ∞] x f(x) dx
The r-th raw moment is denoted by µr'; thus
µr' = ∫[-∞ to ∞] x^r f(x) dx
µ1' = E(X), µ2' = E(X²)
Mean = X̄ = µ1' = E(X)
and Variance = µ2' - (µ1')², i.e. Variance = E(X²) - [E(X)]²
r-th moment (about the mean)
µr = E[{X - E(X)}^r] = ∫[-∞ to ∞] (x - X̄)^r f(x) dx
This gives the r-th moment about the mean, and it is denoted by µr.
Put r = 1 in (B):
µ1 = ∫ (x - X̄) f(x) dx = ∫ x f(x) dx - X̄ ∫ f(x) dx = X̄ - X̄ (since ∫ f(x) dx = 1)
µ1 = 0
Put r = 2 in (B):
µ2 = ∫ (x - X̄)² f(x) dx = Variance, i.e. Variance = µ2 = E[X - E(X)]²
which gives the variance in terms of expectations.
Note
Let g(X) = K (a constant). Then
E[g(X)] = E(K) = ∫ K f(x) dx = K ∫ f(x) dx = K (since ∫ f(x) dx = 1)
Thus E(K) = K, i.e. E[a constant] = constant.

1.3.4 EXPECTATIONS (Discrete R.V.'s)
Let X be a discrete random variable with p.m.f. p(x). Then
E(X) = Σx x p(x)
For a discrete random variable X, E(X^r) = Σx x^r p(x). If we denote E(X^r) = µr', then
µr' = E[X^r] = Σx x^r p(x)
Put r = 1: Mean = µ1' = Σx x p(x)
Put r = 2: µ2' = E[X²] = Σx x² p(x)
µ2 = µ2' - (µ1')² = E(X²) - [E(X)]²
The r-th moment about the mean:
µr = E[{X - E(X)}^r] = Σx (x - X̄)^r p(x)
Put r = 2: Variance = µ2 = Σx (x - X̄)² p(x), where X̄ = E(X) (by definition).
1.3.5 ADDITION THEOREM (EXPECTATION)
Theorem 1
If X and Y are two continuous random variables with p.d.f.s fX(x) and fY(y), then E(X + Y) = E(X) + E(Y).
1.3.6 MULTIPLICATION THEOREM OF EXPECTATION
Theorem 2
If X and Y are independent random variables, then E(XY) = E(X) E(Y).
Note: If X1, X2, ..., Xn are n independent random variables, then
E[X1 X2 ... Xn] = E(X1) E(X2) ... E(Xn)
Theorem 3
If X is a random variable with p.d.f. f(x) and a is a constant, then
(i) E[a G(X)] = a E[G(X)]
(ii) E[G(X) + a] = E[G(X)] + a
where G(X) is a function of X which is also a random variable.
Theorem 4
If X is a random variable with p.d.f. f(x) and a and b are constants, then E[aX + b] = a E(X) + b.
Cor 1: If we take a = 1 and b = -E(X) = -X̄, then we get E(X - X̄) = E(X) - E(X) = 0.
Note
E[1/X] ≠ 1/E(X), E[log X] ≠ log E(X), E(X²) ≠ [E(X)]²
1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, ..., Xn be any n random variables and let a1, a2, ..., an be constants. Then
E[a1X1 + a2X2 + ... + anXn] = a1E(X1) + a2E(X2) + ... + anE(Xn)
Result
If X is a random variable, then Var(aX + b) = a² Var(X), where a and b are constants.
Covariance
If X and Y are random variables, then the covariance between them is defined as
Cov(X, Y) = E{[X - E(X)][Y - E(Y)]}
= E{XY - X E(Y) - E(X) Y + E(X) E(Y)}
Cov(X, Y) = E(XY) - E(X) E(Y) ... (A)
If X and Y are independent, then E(XY) = E(X) E(Y) ... (B)
Substituting (B) in (A): if X and Y are independent, then Cov(X, Y) = 0.
Note
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + a, Y + b) = Cov(X, Y)
(iii) Cov(aX + b, cY + d) = ac Cov(X, Y)
(iv) Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2); if X1 and X2 are independent, Var(X1 + X2) = Var(X1) + Var(X2)
EXPECTATION TABLE
Discrete R.V.'s / Continuous R.V.'s
1. E(X) = Σ x p(x) / E(X) = ∫ x f(x) dx
2. µr' = E(X^r) = Σ x^r p(x) / µr' = E(X^r) = ∫ x^r f(x) dx
3. Mean = µ1' = Σ x p(x) / Mean = µ1' = ∫ x f(x) dx
4. µ2' = Σ x² p(x) / µ2' = ∫ x² f(x) dx
5. Variance = µ2' - (µ1')² = E(X²) - {E(X)}² (in both cases)
SOLVED PROBLEMS ON DISCRETE R.V.'S
Example 1
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution
Let X be the random variable denoting the number that turns up in a die. X takes the values 1, 2, 3, 4, 5, 6, each with probability 1/6.
E(X) = Σ xi p(xi) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 21/6 = 7/2 ... (1)
E(X²) = Σ xi² p(xi) = 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6) = 91/6 ... (2)
Var(X) = E(X²) - [E(X)]² = 91/6 - 49/4 = (182 - 147)/12 = 35/12
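A short Python check of the die example (a sketch added to these notes):

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)                      # fair die: p(x) = 1/6 for x = 1..6
EX  = sum(x * p for x in faces)         # E(X)  = 7/2
EX2 = sum(x * x * p for x in faces)     # E(X²) = 91/6
var = EX2 - EX ** 2                     # Var(X) = 35/12
print(EX, EX2, var)
```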

Example 2
Find the value of (i) C and (ii) the mean of the following distribution:
f(x) = C(x - x²) for 0 < x < 1, and f(x) = 0 otherwise.
Solution
Given f(x) = C(x - x²), 0 < x < 1 ... (1)
Since f is a density, ∫ f(x) dx = 1:
∫[0 to 1] C(x - x²) dx = 1
C [x²/2 - x³/3] from 0 to 1 = 1
C(1/2 - 1/3) = 1, so C/6 = 1 and C = 6 ... (2)
Substituting (2) in (1): f(x) = 6(x - x²), 0 < x < 1 ... (3)
Mean = E(X) = ∫ x f(x) dx = ∫[0 to 1] 6x(x - x²) dx = ∫[0 to 1] (6x² - 6x³) dx = [6x³/3 - 6x⁴/4] from 0 to 1 = 2 - 3/2 = 1/2
Mean = 1/2, C = 6
1.4 CONTINUOUS DISTRIBUTION FUNCTION
Def: If f(x) is the p.d.f. of a continuous random variable X, then the function
FX(x) = F(x) = P(X ≤ x) = ∫[-∞ to x] f(t) dt, -∞ < x < ∞
is called the distribution function or cumulative distribution function of the random variable.
PROPERTIES OF CDF OF A R.V. X
(i) 0 ≤ F(x) ≤ 1, -∞ < x < ∞
(ii) lim (x→-∞) F(x) = 0, lim (x→∞) F(x) = 1
(iii) P(a ≤ X ≤ b) = ∫[a to b] f(x) dx = F(b) - F(a)
(iv) F'(x) = dF(x)/dx = f(x)
(v) P(X = xi) = F(xi) - F(xi-1)
Example 1.4.1
Given the p.d.f. of a continuous random variable X as f(x) = 6x(1 - x) for 0 < x < 1 and f(x) = 0 otherwise, find the c.d.f. of X.
Solution
The c.d.f. is F(x) = ∫[-∞ to x] f(t) dt, -∞ < x < ∞.
(i) When x < 0: F(x) = ∫[-∞ to x] 0 dt = 0
(ii) When 0 < x < 1: F(x) = ∫[-∞ to 0] f(t) dt + ∫[0 to x] 6t(1 - t) dt = 0 + 6[t²/2 - t³/3] from 0 to x = 3x² - 2x³
(iii) When x > 1: F(x) = ∫[-∞ to 0] 0 dt + ∫[0 to 1] 6t(1 - t) dt + ∫[1 to x] 0 dt = 1
Using (1), (2) and (3) we get:
F(x) = 0 for x < 0; F(x) = 3x² - 2x³ for 0 < x < 1; F(x) = 1 for x > 1
Example 1.4.2
(i) Is the function defined as follows a density function?
f(x) = e^(-x) for x ≥ 0, and f(x) = 0 for x < 0.
(ii) If so, determine the probability that the variate having this density will fall in the interval (1, 2).
Solution
(a) In (0, ∞), e^(-x) > 0, so f(x) ≥ 0 everywhere.
(b) ∫[-∞ to ∞] f(x) dx = ∫[-∞ to 0] 0 dx + ∫[0 to ∞] e^(-x) dx = [-e^(-x)] from 0 to ∞ = 0 + 1 = 1
Hence f(x) is a p.d.f.
(ii) We know that P(a ≤ X ≤ b) = ∫[a to b] f(x) dx:
P(1 ≤ X ≤ 2) = ∫[1 to 2] e^(-x) dx = [-e^(-x)] from 1 to 2 = -e^(-2) + e^(-1) = e^(-1) - e^(-2)
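The piecewise c.d.f. of Example 1.4.1 can be reproduced symbolically. A small sympy sketch (an addition to the notes):

```python
import sympy as sp

x, t = sp.symbols("x t")
f = 6 * t * (1 - t)                     # density of Example 1.4.1 on (0, 1)

F = sp.integrate(f, (t, 0, x))          # CDF for 0 < x < 1
print(sp.expand(F))                     # -2*x**3 + 3*x**2, i.e. 3x² - 2x³
print(sp.integrate(f, (t, 0, 1)))       # 1, so f is a valid density
```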

Example 1.4.3
A probability curve y = f(x) has a range from 0 to ∞. If f(x) = e^(-x), find the mean, the variance and the third moment about the mean.
Solution
Mean = ∫[0 to ∞] x f(x) dx = ∫[0 to ∞] x e^(-x) dx = [x(-e^(-x)) - e^(-x)] from 0 to ∞ = 1
Mean = 1
Variance = µ2 = ∫[0 to ∞] (x - Mean)² f(x) dx = ∫[0 to ∞] (x - 1)² e^(-x) dx = 1
µ2 = 1
Third moment about the mean:
µ3 = ∫[0 to ∞] (x - Mean)³ f(x) dx = ∫[0 to ∞] (x - 1)³ e^(-x) dx
= [-(x - 1)³ e^(-x) - 3(x - 1)² e^(-x) - 6(x - 1) e^(-x) - 6e^(-x)] from 0 to ∞ = 2
µ3 = 2
1.5 MOMENT GENERATING FUNCTION
Def: The moment generating function (MGF) of a random variable X (about the origin), whose probability function is f(x), is given by
MX(t) = E[e^(tX)]
= ∫[-∞ to ∞] e^(tx) f(x) dx, for a continuous probability function
= Σx e^(tx) p(x), for a discrete probability function
where t is a real parameter and the integration or summation extends over the entire range of x.
Example 1.5.1
Prove that the r-th moment of the random variable X about the origin is the coefficient of t^r/r! in MX(t), i.e. MX(t) = Σr µr' t^r/r!.
Proof
MX(t) = E(e^(tX)) = E[1 + tX + (tX)²/2! + (tX)³/3! + ... + (tX)^r/r! + ...]
= 1 + t E(X) + (t²/2!) E(X²) + ... + (t^r/r!) E(X^r) + ...
= 1 + t µ1' + (t²/2!) µ2' + ... + (t^r/r!) µr' + ... [using µr' = E(X^r)]
Thus the r-th moment µr' = coefficient of t^r/r! in MX(t).
Note
1. The above result gives the MGF in terms of the moments.
2. Since MX(t) generates moments, it is known as the moment generating function.
Example 1.5.2
Find µ1' and µ2' from MX(t).
Proof
MX(t) = 1 + t µ1' + (t²/2!) µ2' + ... + (t^r/r!) µr' + ... (A)
Differentiating (A) with respect to t:
MX'(t) = µ1' + t µ2' + (t²/2!) µ3' + ... (B)
Put t = 0 in (B): MX'(0) = µ1' = Mean
Mean = MX'(0) (or) [d MX(t)/dt] at t = 0
Differentiating (B) with respect to t again: MX''(t) = µ2' + t µ3' + ...
Put t = 0: MX''(0) = µ2' (or) µ2' = [d² MX(t)/dt²] at t = 0
In general, µr' = [d^r MX(t)/dt^r] at t = 0.
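The two MGF facts are easy to verify symbolically. A sympy sketch (added here, using the density e^(-x) of Example 1.4.3):

```python
import sympy as sp

x, t = sp.symbols("x t")
f = sp.exp(-x)                                    # density of Example 1.4.3

M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo),
                 conds="none")                    # MGF: 1/(1 - t), t < 1
mu1 = sp.diff(M, t).subs(t, 0)                    # M'(0)  = 1 (the mean)
mu2 = sp.diff(M, t, 2).subs(t, 0)                 # M''(0) = 2 = E(X²)
print(sp.simplify(M), mu1, mu2 - mu1**2)          # variance = 1
```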

Example 1.5.3
Obtain the MGF of X about the point X = a.
Proof
The moment generating function of X about the point X = a is
MX(t) (about X = a) = E[e^(t(X - a))]
= E[1 + t(X - a) + (t²/2!)(X - a)² + ... + (t^r/r!)(X - a)^r + ...] [formula: e^x = 1 + x + x²/2! + ...]
= 1 + t E(X - a) + (t²/2!) E(X - a)² + ... + (t^r/r!) E(X - a)^r + ...
= 1 + t µ1' + (t²/2!) µ2' + ... + (t^r/r!) µr' + ..., where µr' = E[(X - a)^r]
[MX(t)] (about X = a) = 1 + t µ1' + (t²/2!) µ2' + ... + (t^r/r!) µr' + ...
Result
McX(t) = E[e^(tcX)] ... (1)
MX(ct) = E[e^(ctX)] ... (2)
From (1) and (2) we get McX(t) = MX(ct).
Example 1.5.4
If X1, X2, ..., Xn are independent random variables, then prove that
M(X1 + X2 + ... + Xn)(t) = E[e^(t(X1 + X2 + ... + Xn))]
= E[e^(tX1) e^(tX2) ... e^(tXn)]
= E(e^(tX1)) E(e^(tX2)) ... E(e^(tXn)) [since X1, X2, ..., Xn are independent]
= MX1(t) MX2(t) ... MXn(t)
Example 1.5.5
Prove that if U = (X - a)/h, then MU(t) = e^(-at/h) MX(t/h), where a, h are constants.
Proof
By definition, MU(t) = E[e^(tU)]
= E[e^(t(X - a)/h)] = E[e^(tX/h) e^(-ta/h)]
= e^(-ta/h) E[e^(tX/h)] [by definition]
MU(t) = e^(-at/h) MX(t/h), where U = (X - a)/h and MX(t) is the MGF about the origin.
Example 1.5.6
Find the MGF for the distribution where
f(x) = 1/3 at x = 1, f(x) = 2/3 at x = 2, f(x) = 0 otherwise.
Solution
Given f(1) = 1/3, f(2) = 2/3, f(3) = f(4) = ... = 0.
The MGF of a random variable X is given by

MX(t) = E[e^(tX)] = Σx e^(tx) f(x)
= e^t f(1) + e^(2t) f(2) = (1/3) e^t + (2/3) e^(2t)
MGF is MX(t) = (e^t/3)(1 + 2e^t)
1.6 Discrete Distributions
The important discrete distributions of a random variable X are:
1. Binomial distribution
2. Poisson distribution
3. Geometric distribution
1.6.1 BINOMIAL DISTRIBUTION
Def: A random variable X is said to follow a binomial distribution if its probability law is given by
P(x) = P(X = x successes) = nCx p^x q^(n-x), where x = 0, 1, 2, ..., n and p + q = 1.
Note: Assumptions in the binomial distribution:
i) There are only two possible outcomes for each trial (success or failure).
ii) The probability of a success is the same for each trial.
iii) There are n trials, where n is a constant.
iv) The n trials are independent.
Example 1.6.1
Find the moment generating function (MGF) of the binomial distribution about the origin.
Solution
We know that MX(t) = Σx e^(tx) p(x). Let X be a random variable which follows the binomial distribution, with p(x) = nCx p^x q^(n-x). Then the MGF about the origin is
MX(t) = E[e^(tX)] = Σ(x = 0 to n) e^(tx) nCx p^x q^(n-x)
= Σ(x = 0 to n) (pe^t)^x nCx q^(n-x)
MX(t) = (q + pe^t)^n [by the binomial theorem]
Example 1.6.2
Find the mean and variance of the binomial distribution.
Solution
MX(t) = (q + pe^t)^n
MX'(t) = n(q + pe^t)^(n-1) pe^t
Put t = 0: MX'(0) = n(q + p)^(n-1) p = np [since q + p = 1]
Mean = E(X) = np
MX''(t) = np[(q + pe^t)^(n-1) e^t + e^t (n - 1)(q + pe^t)^(n-2) pe^t]
Put t = 0: MX''(0) = np[(q + p)^(n-1) + (n - 1)(q + p)^(n-2) p] = np[1 + (n - 1)p]
= np + n²p² - np² = n²p² + np(1 - p) = n²p² + npq [since 1 - p = q]
MX''(0) = E(X²) = n²p² + npq
Var(X) = E(X²) - [E(X)]² = n²p² + npq - n²p² = npq
Var(X) = npq, S.D. = √(npq)
Example 1.6.3
Find the moment generating function (MGF) of the binomial distribution about the mean (np).
Solution
We know that the MGF of a random variable X about any point a is MX(t) (about X = a) = E[e^(t(X - a))].
Here a is the mean np of the binomial distribution.
MX(t) (about X = np) = E[e^(t(X - np))] = E[e^(tX) e^(-tnp)]
= e^(-tnp) E[e^(tX)] = e^(-tnp) (q + pe^t)^n = (e^(-tp))^n (q + pe^t)^n
MGF about the mean = (e^(-tp))^n (q + pe^t)^n
Example 1.6.4
Additive property of the binomial distribution.
Solution
The sum of two binomial variates is, in general, not a binomial variate. Let X and Y be two independent binomial variates with parameters (n1, p1) and (n2, p2) respectively. Then
MX(t) = (q1 + p1 e^t)^n1, MY(t) = (q2 + p2 e^t)^n2
MX+Y(t) = MX(t) MY(t) [since X and Y are independent random variables]
= (q1 + p1 e^t)^n1 (q2 + p2 e^t)^n2
The RHS cannot be expressed in the form (q + pe^t)^n. Hence, by the uniqueness theorem of MGFs, X + Y is not a binomial variate. Hence, in general, the sum of two binomial variates is not a binomial variate.
Example 1.6.5
If MX(t) = (q + pe^t)^n1 and MY(t) = (q + pe^t)^n2 (the same p and q), then
MX+Y(t) = (q + pe^t)^(n1 + n2)
i.e. X + Y is a binomial variate with parameters (n1 + n2, p).
Problems on the Binomial Distribution
Problem 1
Check whether the following data follow a binomial distribution or not: mean = 3, variance = 4.
Solution
Given mean np = 3 ... (1), variance npq = 4 ... (2)
Dividing (2) by (1): q = 4/3, which is greater than 1.
Since q > 1 is not possible (0 < q < 1), the given data do not follow a binomial distribution.
Problem 2
The mean and S.D. of a binomial distribution are 5 and 2; determine the distribution.
Solution
Given mean np = 5 ... (1), S.D. √(npq) = 2, i.e. npq = 4 ... (2)
Dividing (2) by (1): q = 4/5, so p = 1 - q = 1/5 ... (3)
Substituting (3) in (1): n(1/5) = 5, so n = 25.
The binomial distribution is
P(X = x) = p(x) = nCx p^x q^(n-x) = 25Cx (1/5)^x (4/5)^(25-x), x = 0, 1, 2, ..., 25
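A one-line check of Problem 2 with scipy (an added sketch):

```python
from scipy.stats import binom

# Problem 2: n = 25, p = 1/5 should give mean 5 and S.D. 2.
n, p = 25, 1 / 5
print(binom.mean(n, p))   # 5.0
print(binom.std(n, p))    # 2.0
```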

1.7 POISSON DISTRIBUTION
Def: A random variable X is said to follow a Poisson distribution if its probability law is given by
P(X = x) = p(x) = e^(-λ) λ^x / x!, x = 0, 1, 2, ...
The Poisson distribution is a limiting case of the binomial distribution under the following conditions or assumptions:
1. The number of trials n should be infinitely large, i.e. n → ∞.
2. The probability of success p for each trial is infinitely small.
3. np = λ should be finite, where λ is a constant.
To find the MGF:
MX(t) = E(e^(tX)) = Σx e^(tx) p(x)
= Σ(x = 0 to ∞) e^(tx) e^(-λ) λ^x / x!
= e^(-λ) Σ(x = 0 to ∞) (λe^t)^x / x!
= e^(-λ) [1 + λe^t + (λe^t)²/2! + ...]
= e^(-λ) e^(λe^t)
MX(t) = e^(λ(e^t - 1))
To find the mean and variance:
µ1' = E(X) = Σ x p(x) = Σ(x = 1 to ∞) x e^(-λ) λ^x / x!
= λ e^(-λ) Σ(x = 1 to ∞) λ^(x-1)/(x - 1)!
= λ e^(-λ) [1 + λ + λ²/2! + ...] = λ e^(-λ) e^λ
Mean = λ
µ2' = E[X²] = Σ x² p(x) = Σ {x(x - 1) + x} e^(-λ) λ^x / x!
= Σ x(x - 1) e^(-λ) λ^x / x! + Σ x e^(-λ) λ^x / x!
= λ² e^(-λ) Σ(x = 2 to ∞) λ^(x-2)/(x - 2)! + λ
= λ² e^(-λ) e^λ + λ = λ² + λ
Variance = µ2 = E(X²) - [E(X)]² = λ² + λ - λ² = λ
Variance = λ. Hence Mean = Variance = λ.
Note: The sum of independent Poisson variates is also a Poisson variate.
PROBLEMS ON POISSON DISTRIBUTION
Example 1.7.1
If X is a Poisson variate such that P(X = 1) = 3/10 and P(X = 2) = 1/5, find P(X = 0) and P(X = 3).
Solution
P(X = x) = e^(-λ) λ^x / x!
P(X = 1) = e^(-λ) λ = 3/10 (given) ... (1)
P(X = 2) = e^(-λ) λ²/2! = 1/5 (given) ... (2)
From (1): e^(-λ) = 3/(10λ) ... (3)
From (2): e^(-λ) = 2/(5λ²) ... (4)
Equating (3) and (4): 3/(10λ) = 2/(5λ²), so 15λ² = 20λ, i.e. λ = 4/3
P(X = 0) = e^(-λ) = e^(-4/3)
P(X = 3) = e^(-λ) λ³/3! = e^(-4/3) (4/3)³/3!
Example 1.7.2
If X is a Poisson variable with P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find (i) the mean of X and (ii) the variance of X.
Solution
P(X = x) = e^(-λ) λ^x / x!, x = 0, 1, 2, ...
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6):
e^(-λ) λ²/2! = 9 e^(-λ) λ⁴/4! + 90 e^(-λ) λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6!
1/2 = 3λ²/8 + λ⁴/8
4 = 3λ² + λ⁴
λ⁴ + 3λ² - 4 = 0
(λ² - 1)(λ² + 4) = 0, so λ² = 1 or λ² = -4, i.e. λ = ±1 or λ = ±2i
Since λ > 0, λ = 1.
Mean = λ = 1, Variance = λ = 1, Standard deviation = 1
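The answers of Example 1.7.1 can be evaluated numerically (an added sketch):

```python
from scipy.stats import poisson

lam = 4 / 3                                  # λ found in Example 1.7.1
print(poisson.pmf(0, lam))                   # P(X=0) = e^(-4/3) ≈ 0.2636
print(poisson.pmf(3, lam))                   # P(X=3) ≈ 0.1041
```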

1.7.3 Derive the probability mass function of the Poisson distribution as a limiting case of the binomial distribution.
Solution
We know that the binomial distribution is
P(X = x) = nCx p^x q^(n-x) = [n!/((n - x)! x!)] p^x (1 - p)^(n-x)
= [n(n - 1)(n - 2)...(n - x + 1)/x!] p^x (1 - p)^(n-x)
Putting p = λ/n:
P(X = x) = [n(n - 1)(n - 2)...(n - x + 1)/x!] (λ/n)^x (1 - λ/n)^(n-x)
= (λ^x/x!) [(1 - 1/n)(1 - 2/n) ... (1 - (x - 1)/n)] (1 - λ/n)^n (1 - λ/n)^(-x)
When n → ∞, each factor (1 - k/n) → 1 for fixed k, (1 - λ/n)^(-x) → 1, and

lim (n→∞) (1 - λ/n)^n = e^(-λ). Hence
P(X = x) = e^(-λ) λ^x / x!, x = 0, 1, 2, ...
which is the Poisson probability mass function.
1.8 GEOMETRIC DISTRIBUTION
Def: A discrete random variable X is said to follow a geometric distribution if it assumes only positive integer values and its probability mass function is given by
P(X = x) = p(x) = q^(x-1) p; x = 1, 2, 3, ...; 0 < p < 1, where q = 1 - p.
Example 1.8.1
To find the MGF:
MX(t) = E[e^(tX)] = Σx e^(tx) p(x)
= Σ(x = 1 to ∞) e^(tx) q^(x-1) p
= (p/q) Σ(x = 1 to ∞) (qe^t)^x
= (p/q) [qe^t + (qe^t)² + (qe^t)³ + ...]
= (p/q) qe^t [1 + qe^t + (qe^t)² + ...]
= pe^t (1 - qe^t)^(-1) [geometric series, valid for qe^t < 1]
MX(t) = pe^t / (1 - qe^t)
To find the mean and variance:
MX'(t) = [(1 - qe^t) pe^t + pe^t qe^t]/(1 - qe^t)² = pe^t/(1 - qe^t)²
E(X) = MX'(0) = p/(1 - q)² = p/p² = 1/p
Mean = 1/p
MX''(t) = d/dt [pe^t (1 - qe^t)^(-2)] = [(1 - qe^t)² pe^t + 2pe^t qe^t (1 - qe^t)]/(1 - qe^t)⁴ = pe^t (1 + qe^t)/(1 - qe^t)³
MX''(0) = E(X²) = p(1 + q)/p³ = (1 + q)/p²
Var(X) = E(X²) - [E(X)]² = (1 + q)/p² - 1/p² = q/p²
Var(X) = q/p²
Note: Another form of the geometric distribution is
P[X = x] = q^x p; x = 0, 1, 2, ...
MX(t) = p/(1 - qe^t), Mean = q/p, Variance = q/p²
Example 1.8.2
If the MGF of X is (5 - 4e^t)^(-1), find the distribution of X and P(X = 5).
Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, ... The MGF of this geometric distribution is given by
p/(1 - qe^t) ... (1)
Here MX(t) = (5 - 4e^t)^(-1) = (1/5)/(1 - (4/5)e^t) ... (2)
Comparing (1) and (2) we get q = 4/5, p = 1/5.
P(X = x) = p q^x = (1/5)(4/5)^x, x = 0, 1, 2, 3, ...
P(X = 5) = (1/5)(4/5)⁵ = 4⁵/5⁶
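A quick check of Example 1.8.2 (an added sketch; note the convention shift between the two forms of the distribution):

```python
from scipy.stats import geom

p, q = 1 / 5, 4 / 5
# scipy's geom counts trials until the first success: pmf(k) = q**(k-1) * p,
# k = 1, 2, ...  The form used in Example 1.8.2 counts failures, so X = k - 1.
print(geom.pmf(6, p))     # P(X = 5) = (4/5)**5 * (1/5) = 0.065536
print(4**5 / 5**6)        # same value from the worked answer
print(geom.mean(p) - 1)   # E(X) = q/p = 4.0
```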

1.9 CONTINUOUS DISTRIBUTIONS
If X is a continuous random variable then we have the following distributions:
1. Uniform (rectangular) distribution
2. Exponential distribution
3. Gamma distribution
4. Normal distribution
1.9.1 Uniform Distribution (Rectangular Distribution)
Def: A random variable X is said to follow a uniform distribution if its density is
f(x) = 1/(b - a), a < x < b; f(x) = 0 otherwise.
To find the MGF:
MX(t) = ∫ e^(tx) f(x) dx = ∫[a to b] e^(tx)/(b - a) dx = [e^(tx)/((b - a)t)] from a to b
The MGF of the uniform distribution is
MX(t) = (e^(bt) - e^(at))/((b - a)t)
To find the mean and variance:
E(X) = ∫ x f(x) dx = ∫[a to b] x/(b - a) dx = [x²/(2(b - a))] from a to b = (b² - a²)/(2(b - a)) = (a + b)/2
Mean = µ1' = (a + b)/2
Putting r = 2 in µr' = ∫[a to b] x^r f(x) dx, we get
µ2' = ∫[a to b] x²/(b - a) dx = (b³ - a³)/(3(b - a)) = (a² + ab + b²)/3
Variance = µ2' - (µ1')² = (a² + ab + b²)/3 - ((a + b)/2)² = (b - a)²/12
PROBLEMS ON THE UNIFORM DISTRIBUTION
Example 1.9.1
If X is uniformly distributed over (-α, α), α > 0, find α so that
(i) P(X > 1) = 1/3
(ii) P(|X| < 1) = P(|X| > 1)
Solution
If X is uniformly distributed in (-α, α), then its p.d.f. is
f(x) = 1/(2α), -α < x < α; f(x) = 0 otherwise.
(i) P(X > 1) = 1/3
∫[1 to α] f(x) dx = 1/3
∫[1 to α] (1/(2α)) dx = 1/3
(x/(2α)) from 1 to α = 1/3
(α - 1)/(2α) = 1/3, so 3α - 3 = 2α, i.e. α = 3
(ii) P(|X| < 1) = P(|X| > 1) = 1 - P(|X| < 1)
2 P(|X| < 1) = 1, so P(|X| < 1) = 1/2
P(-1 < X < 1) = ∫[-1 to 1] f(x) dx = ∫[-1 to 1] (1/(2α)) dx = 2/(2α) = 1/α = 1/2, so α = 2
Note:
1. The distribution function F(x) is given by
F(x) = 0 for -∞ < x < a; F(x) = (x - a)/(b - a) for a ≤ x ≤ b; F(x) = 1 for b < x < ∞
2. The p.d.f. of a uniform variate X in (-a, a) is given by
f(x) = 1/(2a), -a < x < a; f(x) = 0 otherwise.
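A short numeric check of Example 1.9.1(i) (an added sketch):

```python
from scipy.stats import uniform

# X ~ U(-3, 3): in scipy, loc = a and scale = b - a.
X = uniform(loc=-3, scale=6)
print(X.sf(1))              # P(X > 1) = 1/3
print(X.mean(), X.var())    # 0.0 and (b - a)²/12 = 3.0
```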

1.10 THE EXPONENTIAL DISTRIBUTION
Def: A continuous random variable X is said to follow an exponential distribution with parameter λ > 0 if its probability density function is given by
f(x) = λe^(-λx) for x > 0; f(x) = 0 otherwise.
To find the MGF:
MX(t) = ∫ e^(tx) f(x) dx = ∫[0 to ∞] e^(tx) λe^(-λx) dx
= λ ∫[0 to ∞] e^(-(λ - t)x) dx = λ [e^(-(λ - t)x)/(-(λ - t))] from 0 to ∞ = λ/(λ - t)
MGF of X: MX(t) = λ/(λ - t), λ > t
To find the mean and variance:
We know that the MGF is
MX(t) = λ/(λ - t) = 1/(1 - t/λ) = 1 + t/λ + (t/λ)² + ... + (t/λ)^r + ...
µ1' = coefficient of t/1! = 1/λ
µ2' = coefficient of t²/2! = 2/λ²
Mean = 1/λ
Variance = µ2' - (µ1')² = 2/λ² - 1/λ² = 1/λ²
Example 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3) e^(-x/3) for x > 0; f(x) = 0 otherwise.
Find (1) P(X > 3), (2) the MGF of X.
Solution
We know the exponential distribution is f(x) = λe^(-λx), x > 0. Here λ = 1/3.
P(X > 3) = ∫[3 to ∞] f(x) dx = ∫[3 to ∞] (1/3) e^(-x/3) dx = [-e^(-x/3)] from 3 to ∞ = e^(-1)
P(X > 3) = e^(-1)
MX(t) = λ/(λ - t) = (1/3)/((1/3) - t)
MX(t) = 1/(1 - 3t)
Note
If X is exponentially distributed, then P(X > s + t / X > s) = P(X > t) for any s, t > 0 (the memoryless property).
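Example 1.10.1 and the memoryless property can be checked numerically (an added sketch):

```python
from scipy.stats import expon

# Example 1.10.1: f(x) = (1/3) e^(-x/3), i.e. scale = 1/λ = 3.
X = expon(scale=3)
print(X.sf(3))              # P(X > 3) = e^(-1) ≈ 0.3679
# memoryless property: P(X > 2 + 3 | X > 2) = P(X > 3)
print(X.sf(5) / X.sf(2))    # same value
```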

1.11 GAMMA DISTRIBUTION
Definition
A continuous random variable X taking non-negative values is said to follow a gamma distribution if its probability density function is given by
f(x) = x^(α-1) e^(-x)/Γ(α), α > 0, 0 < x < ∞; f(x) = 0 elsewhere,
where Γ(α) = ∫[0 to ∞] x^(α-1) e^(-x) dx and α is the parameter of the distribution.
Additive property of gamma variates
If X1, X2, X3, ..., Xk are independent gamma variates with parameters λ1, λ2, ..., λk respectively, then X1 + X2 + ... + Xk is also a gamma variate with parameter λ1 + λ2 + ... + λk.
Example 1.11.1
Customer demand for milk in a certain locality, per month, is known to be a general gamma random variable. If the average demand is a liters and the most likely demand is b liters (b < a), what is the variance of the demand?
Solution
Let X represent the monthly customer demand for milk. The average demand is the value of E(X). The most likely demand is the value of the mode of X, i.e. the value of X for which its density function is maximum. If f(x) is the density function of the general gamma variate,
f(x) = λ^k x^(k-1) e^(-λx)/Γ(k), x > 0
f'(x) = (λ^k/Γ(k)) [(k - 1) x^(k-2) e^(-λx) - λ x^(k-1) e^(-λx)]
f'(x) = 0 when x = (k - 1)/λ, and f''(x) < 0 there, so f(x) is maximum at x = (k - 1)/λ.
Therefore the most likely demand is b = (k - 1)/λ ... (1)
and the average demand is a = E(X) = k/λ ... (2)
From (1) and (2): a - b = 1/λ, so λ = 1/(a - b) and k = aλ = a/(a - b).
Now V(X) = k/λ² = [a/(a - b)](a - b)²
V(X) = a(a - b)
TUTORIAL QUESTIONS
1. It is known that the probability that an item produced by a certain machine will be defective is 0.05. The produced items are sent to the market in packets of 20. Find the number of packets containing at least, exactly and at most 2 defective items in a consignment of 1000 packets using (i) the binomial distribution and (ii) the Poisson approximation to the binomial distribution.
2. The daily consumption of milk in excess of 20,000 gallons is approximately exponentially distributed with θ = 3000. The city has a daily stock of 35,000 gallons. What is the probability that, of two days selected at random, the stock is insufficient on both days?
3. The density function of a random variable X is given by f(x) = Kx(2 - x), 0 ≤ x ≤ 2. Find K, the mean, the variance and the r-th moment.
4. A binomial variable X satisfies the relation 9P(X = 4) = P(X = 2) when n = 6. Find the parameter p of the binomial distribution.
5. Find the M.G.F. of the Poisson distribution.
6. If X and Y are independent Poisson variates such that P(X = 1) = P(X = 2) and P(Y = 2) = P(Y = 3), find V(X - 2Y).
7. A discrete random variable has the following probability distribution:
X : 0 1 2 3 4 5 6 7 8
P(X) : a 3a 5a 7a 9a 11a 13a 15a 17a
Find the value of a, P(X < 3) and the c.d.f. of X.

8. In a component manufacturing industry there is a small probability of 1/500 for any component to be defective. The components are supplied in packets of 10. Use the Poisson distribution to calculate the approximate number of packets containing (1) no defective and (2) two defective components in a consignment of 10,000 packets.
WORKED OUT EXAMPLES
Example 1
Given the p.d.f. of a continuous random variable X as f(x) = 6x(1 - x) for 0 < x < 1 and f(x) = 0 otherwise, find the c.d.f. of X.
Solution
The c.d.f. is F(x) = ∫[-∞ to x] f(t) dt, -∞ < x < ∞.
(i) When x < 0: F(x) = 0
(ii) When 0 < x < 1: F(x) = ∫[0 to x] 6t(1 - t) dt = 6[t²/2 - t³/3] from 0 to x = 3x² - 2x³
(iii) When x > 1: F(x) = ∫[0 to 1] 6t(1 - t) dt = 1
Hence F(x) = 0 for x < 0; F(x) = 3x² - 2x³ for 0 < x < 1; F(x) = 1 for x > 1.
Example 2
A random variable X has the following probability function:
Values of X : 0 1 2 3 4 5 6 7 8
Probability P(X) : a 3a 5a 7a 9a 11a 13a 15a 17a
(i) Determine the value of a. (ii) Find P(X < 3), P(X ≥ 3), P(0 < X < 5). (iii) Find the distribution function of X.
Solution
(i) Since Σ p(xi) = 1: a + 3a + 5a + ... + 17a = 81a = 1, so a = 1/81.
(ii) P(X < 3) = p(0) + p(1) + p(2) = 9/81
P(X ≥ 3) = 1 - P(X < 3) = 1 - 9/81 = 72/81
P(0 < X < 5) = p(1) + p(2) + p(3) + p(4) = 24/81 (here 0 and 5 are not included)
(iii) The distribution function, as in Example 1.1:
F(0) = 1/81, F(1) = 4/81, F(2) = 9/81, F(3) = 16/81, F(4) = 25/81, F(5) = 36/81, F(6) = 49/81, F(7) = 64/81, F(8) = 81/81 = 1
Example 3
The mean and S.D. of a binomial distribution are 5 and 2; determine the distribution.
Solution
Given np = 5 ... (1) and npq = 4 ... (2)
Dividing (2) by (1): q = 4/5, so p = 1/5, and substituting in (1): n = 25.
The binomial distribution is P(X = x) = 25Cx (1/5)^x (4/5)^(25-x), x = 0, 1, 2, ..., 25.
Example 4
If X is a Poisson variable with P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find (i) the mean of X and (ii) the variance of X.
Solution
e^(-λ) λ²/2! = 9 e^(-λ) λ⁴/4! + 90 e^(-λ) λ⁶/6!
1/2 = 3λ²/8 + λ⁴/8, so λ⁴ + 3λ² - 4 = 0, giving λ² = 1 or λ² = -4.
Hence λ = 1 (taking λ > 0), and Mean = λ = 1, Variance = λ = 1, Standard deviation = 1.

UNIT II TWO DIMENSIONAL RANDOM VARIABLES
Introduction
In the previous chapter we studied various aspects of the theory of a single random variable. In this chapter we extend our theory to include two random variables, one for each coordinate axis X and Y of the XY plane.
DEFINITION: Let S be the sample space. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is a two dimensional random variable.
2.1 Types of random variables
1. Discrete random variables
2. Continuous random variables
Two Dimensional Discrete Random Variables
If the possible values of (X, Y) are finite, then (X, Y) is called a two dimensional discrete random variable, and its values can be represented as (xi, yj), i = 1, 2, ..., m; j = 1, 2, ..., n.
In the study of two dimensional discrete random variables we have the following 5 important terms:
- Joint probability function (JPF) (or) joint probability mass function
- Joint probability distribution
- Marginal probability function of X
- Marginal probability function of Y
- Conditional probability function
2.1.1 Joint Probability Function of discrete random variables X and Y
The function P(X = xi, Y = yj) = P(xi, yj) is called the joint probability function of the discrete random variables X and Y, and is denoted by pij.
Note
1. P(X = xi, Y = yj) = P[(X = xi) ∩ (Y = yj)] = pij
2. It should satisfy the following conditions: (i) pij ≥ 0 for all i, j; (ii) Σj Σi pij = 1
2.1.2 Marginal Probability Function of X
If the joint probability distribution of two random variables X and Y is given, then the marginal probability function of X is given by
PX(xi) = pi. = Σj pij
(and similarly the marginal probability function of Y is PY(yj) = p.j = Σi pij).
Conditional Probabilities
The conditional probability function of X given Y = yj is given by
P[X = xi / Y = yj] = P[X = xi, Y = yj]/P[Y = yj] = pij/p.j

The set {xi, pij/p.j}, i = 1, 2, 3, ... is called the conditional probability distribution of X given Y = yj.
The conditional probability function of Y given X = xi is given by
P[Y = yj / X = xi] = P[X = xi, Y = yj]/P[X = xi] = pij/pi.
The set {yj, pij/pi.}, j = 1, 2, 3, ... is called the conditional probability distribution of Y given X = xi.
SOLVED PROBLEMS ON MARGINAL DISTRIBUTIONS
Example 2.1.1
From the following joint distribution of X and Y, find the marginal distributions.
Y \ X :   0      1      2
0     :  3/28   9/28   3/28
1     :  3/14   3/14   0
2     :  1/28   0      0
Solution
The marginal distribution of X:
PX(0) = P(X = 0) = p(0,0) + p(0,1) + p(0,2) = 3/28 + 3/14 + 1/28 = 10/28 = 5/14
PX(1) = P(X = 1) = p(1,0) + p(1,1) + p(1,2) = 9/28 + 3/14 + 0 = 15/28
PX(2) = P(X = 2) = p(2,0) + p(2,1) + p(2,2) = 3/28 + 0 + 0 = 3/28
Marginal probability function of X:
PX(x) = 5/14 for x = 0; 15/28 for x = 1; 3/28 for x = 2
The marginal distribution of Y:
PY(0) = P(Y = 0) = p(0,0) + p(1,0) + p(2,0) = 3/28 + 9/28 + 3/28 = 15/28
PY(1) = P(Y = 1) = p(0,1) + p(1,1) + p(2,1) = 3/14 + 3/14 + 0 = 3/7
PY(2) = P(Y = 2) = p(0,2) + p(1,2) + p(2,2) = 1/28 + 0 + 0 = 1/28
Marginal probability function of Y:
PY(y) = 15/28 for y = 0; 3/7 for y = 1; 1/28 for y = 2
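Marginal and conditional distributions are just row and column sums of the joint table. A numpy sketch of Example 2.1.1 (an addition to the notes):

```python
import numpy as np

# Joint distribution of Example 2.1.1; rows are Y = 0,1,2, columns X = 0,1,2.
P = np.array([[3, 9, 3],
              [6, 6, 0],
              [1, 0, 0]]) / 28         # 3/14 written as 6/28

print(P.sum())                         # 1.0 - a valid joint distribution
print(P.sum(axis=0) * 28)              # marginal of X: [10, 15, 3] / 28
print(P.sum(axis=1) * 28)              # marginal of Y: [15, 12, 1] / 28
print(P[:, 0] / P[:, 0].sum())         # conditional distribution of Y given X = 0
```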

2.3 CONTINUOUS RANDOM VARIABLES
Two dimensional continuous random variables
If (X, Y) can take all the values in a region R in the XY plane, then (X, Y) is called a two dimensional continuous random variable.
Joint probability density function:
(i) fXY(x, y) ≥ 0; (ii) ∫∫ fXY(x, y) dy dx = 1
Joint probability distribution function:
F(x, y) = P[X ≤ x, Y ≤ y] = ∫[-∞ to x] ∫[-∞ to y] f(u, v) dv du
Marginal probability density functions:
fX(x) = ∫[-∞ to ∞] f(x, y) dy (marginal p.d.f. of X)
fY(y) = ∫[-∞ to ∞] f(x, y) dx (marginal p.d.f. of Y)
Conditional probability density functions:
(i) f(y/x) = f(x, y)/fX(x), fX(x) > 0
(ii) f(x/y) = f(x, y)/fY(y), fY(y) > 0
Example 2.3.1
Show that the function
f(x, y) = (1/5)(x + 3y) for 0 < x < 2, 0 < y < 1; f(x, y) = 0 otherwise
is a joint density function of X and Y.
Solution
We know that if f(x, y) satisfies the conditions (i) f(x, y) ≥ 0 and (ii) ∫∫ f(x, y) dx dy = 1, then f(x, y) is a joint density function.
(i) f(x, y) ≥ 0 in the given region.
(ii) ∫[0 to 1] ∫[0 to 2] (1/5)(x + 3y) dx dy = (1/5) ∫[0 to 1] [x²/2 + 3xy] from x = 0 to 2 dy
= (1/5) ∫[0 to 1] (2 + 6y) dy = (1/5) [2y + 3y²] from 0 to 1 = (1/5)(2 + 3) = 1
Since f(x, y) satisfies the two conditions, it is a joint density function.
Example 2.3.2
The joint p.d.f. of the random variables X and Y is given by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x; f(x, y) = 0 otherwise.
Find (i) fX(x), (ii) fY(y), (iii) f(y/x).
Solution
(i) The marginal p.d.f. of X is
fX(x) = ∫ f(x, y) dy = ∫[0 to x] 8xy dy = 8x [y²/2] from 0 to x = 4x³
fX(x) = 4x³, 0 < x < 1
(ii) The marginal p.d.f. of Y is
fY(y) = ∫ f(x, y) dx = ∫[y to 1] 8xy dx = 4y(1 - y²)
fY(y) = 4y(1 - y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/fX(x) = 8xy/4x³ = 2y/x², 0 < y < x, 0 < x < 1

Result (summary)
Marginal p.d.f. of X: fX(x) = 4x³, 0 < x < 1
Marginal p.d.f. of Y: fY(y) = 4y(1 - y²), 0 < y < 1
Conditional p.d.f.: f(y/x) = 2y/x², 0 < y < x
2.4 REGRESSION
Line of regression
The line of regression of X on Y is given by
x - x̄ = r (σx/σy)(y - ȳ)
The line of regression of Y on X is given by
y - ȳ = r (σy/σx)(x - x̄)
Angle between the two lines of regression:
tan θ = [(1 - r²)/r] [σx σy/(σx² + σy²)]
Regression coefficients
Regression coefficient of Y on X: byx = r σy/σx
Regression coefficient of X on Y: bxy = r σx/σy
Correlation coefficient: r = ±√(bxy byx)
Example 2.4.1
From the following data, find
(i) the two regression equations,
(ii) the coefficient of correlation between the marks in economics and statistics,
(iii) the most likely marks in statistics when the marks in economics are 30.
Marks in Economics : 25 28 35 32 31 36 29 38 34 32
Marks in Statistics : 43 46 49 41 36 32 31 30 33 39
Solution

From the computation table (with deviations taken from the means):
ΣX = 320, ΣY = 380, Σ(X - X̄)² = 140, Σ(Y - Ȳ)² = 398, Σ(X - X̄)(Y - Ȳ) = -93
Here X̄ = ΣX/n = 320/10 = 32 and Ȳ = ΣY/n = 380/10 = 38.
Coefficient of regression of Y on X:
byx = Σ(X - X̄)(Y - Ȳ)/Σ(X - X̄)² = -93/140 = -0.6643
Coefficient of regression of X on Y:
bxy = Σ(X - X̄)(Y - Ȳ)/Σ(Y - Ȳ)² = -93/398 = -0.2337
Equation of the line of regression of X on Y:
x - x̄ = bxy (y - ȳ)
x - 32 = -0.2337(y - 38)
x = -0.2337y + 8.88 + 32 = -0.2337y + 40.88
Equation of the line of regression of Y on X:
y - ȳ = byx (x - x̄)
y - 38 = -0.6643(x - 32)
y = -0.6643x + 21.26 + 38 = -0.6643x + 59.26
Coefficient of correlation: r² = byx bxy = (-0.6643)(-0.2337) = 0.1552
r = ±√0.1552 = -0.394 (negative, since both regression coefficients are negative)
Now we have to find the most likely marks in statistics (Y) when the marks in economics (X) are 30, using the line of regression of Y on X:
y = -0.6643x + 59.26
Put x = 30: y = -0.6643(30) + 59.26 = 39.33
y ≈ 39
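A numpy check of Example 2.4.1 (an added sketch):

```python
import numpy as np

econ = np.array([25, 28, 35, 32, 31, 36, 29, 38, 34, 32])
stat = np.array([43, 46, 49, 41, 36, 32, 31, 30, 33, 39])

x, y = econ - econ.mean(), stat - stat.mean()
byx = (x * y).sum() / (x * x).sum()              # -93/140 = -0.6643
bxy = (x * y).sum() / (y * y).sum()              # -93/398 = -0.2337
r = np.corrcoef(econ, stat)[0, 1]                # -0.394
y_at_30 = stat.mean() + byx * (30 - econ.mean()) # ≈ 39.3
print(byx, bxy, r, y_at_30)
```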

2.5 COVARIANCE
Def: If X and Y are random variables, then the covariance between X and Y is defined as
Cov(X, Y) = E(XY) - E(X) E(Y)
Cov(X, Y) = 0 if X and Y are independent.
2.6 CORRELATION
Types of correlation:
- Positive correlation (the two variables deviate in the same direction)
- Negative correlation (the two variables constantly deviate in opposite directions)
2.7 KARL PEARSON'S COEFFICIENT OF CORRELATION
The correlation coefficient between two random variables X and Y, usually denoted by r(X, Y), is a numerical measure of the linear relationship between them and is defined as
r(X, Y) = Cov(X, Y)/(σX σY)
where Cov(X, Y) = (1/n) Σ XY - X̄ Ȳ, σX = √((1/n) Σ X² - X̄²), σY = √((1/n) Σ Y² - Ȳ²)
Limits of the correlation coefficient: -1 ≤ r ≤ 1.
If X and Y are independent, then r(X, Y) = 0.
Note: types of correlation based on r:
r = 1 : perfect positive correlation
0 < r < 1 : positive correlation
-1 < r < 0 : negative correlation
r = 0 : uncorrelated
SOLVED PROBLEMS ON CORRELATION
Example 2.6.1
Calculate the correlation coefficient for the following heights of fathers X and their sons Y.
X : 65 66 67 67 68 69 70 72
Y : 67 68 65 68 72 72 69 71
Solution
Take U = X - 68 and V = Y - 68.
U : -3 -2 -1 -1 0 1 2 4, ΣU = 0, ΣU² = 36
V : -1 0 -3 0 4 4 1 3, ΣV = 8, ΣV² = 52
UV : 3 0 3 0 0 4 2 12, ΣUV = 24
Now Ū = ΣU/n = 0/8 = 0, V̄ = ΣV/n = 8/8 = 1
Cov(X, Y) = Cov(U, V) = (1/n) ΣUV - Ū V̄ = 24/8 - 0 = 3 ... (1)
σU = √((1/n) ΣU² - Ū²) = √(36/8) = 2.121 ... (2)
σV = √((1/n) ΣV² - V̄²) = √(52/8 - 1) = √5.5 = 2.345 ... (3)
r(X, Y) = r(U, V) = Cov(U, V)/(σU σV) = 3/(2.121 × 2.345) = 0.603 [by (1), (2), (3)]
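A two-line check of Example 2.6.1 (an added sketch):

```python
import numpy as np

X = np.array([65, 66, 67, 67, 68, 69, 70, 72])
Y = np.array([67, 68, 65, 68, 72, 72, 69, 71])
print(np.corrcoef(X, Y)[0, 1])     # 0.603
```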

Example 2.6.2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution
E(X) = ∫ x f(x) dx = ∫[-1 to 1] x (1/2) dx = 0
E(Y) = E(X²) = ∫[-1 to 1] x² (1/2) dx = 1/3
E(XY) = E(X X²) = E(X³) = ∫[-1 to 1] x³ (1/2) dx = 0
Cov(X, Y) = E(XY) - E(X) E(Y) = 0 - 0 = 0
r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σX σY) = 0
ρ = 0
Note: Since E(X) = 0 and E(XY) = 0 make Cov(X, Y) = 0, we need not find σX and σY.
2.8 TRANSFORMS OF TWO DIMENSIONAL RANDOM VARIABLES
Formulae:
fU(u) = ∫ fUV(u, v) dv and fV(v) = ∫ fUV(u, v) du
fUV(u, v) = fXY(x, y) |∂(x, y)/∂(u, v)|
Example 2.8.1
If the joint p.d.f. of (X, Y) is given by fXY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the p.d.f. of U = XY.
Solution
Given fXY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v, ∂x/∂v = -u/v², ∂y/∂u = 0, ∂y/∂v = 1
J = ∂(x, y)/∂(u, v) = (1/v)(1) - (-u/v²)(0) = 1/v
|J| = 1/v
The joint p.d.f. of (u, v) is given by
fUV(u, v) = fXY(x, y) |J| = (x + y)(1/v) = (u/v + v)(1/v) = u/v² + 1
The range of v: since 0 ≤ y ≤ 1, we have 0 ≤ v ≤ 1 (as v = y).
The range of u: given 0 ≤ x ≤ 1, so 0 ≤ u/v ≤ 1, i.e. u ≤ v.
Hence the p.d.f. of (u, v) is given by
fUV(u, v) = u/v² + 1, 0 ≤ u ≤ v ≤ 1
Now fU(u) = ∫ fUV(u, v) dv = ∫[u to 1] (u/v² + 1) dv
= [-u/v + v] from v = u to 1 = (-u + 1) - (-1 + u)
fU(u) = 2(1 - u), 0 < u < 1
which is the p.d.f. of U = XY.
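The final integration can be checked symbolically (an added sketch):

```python
import sympy as sp

u, v = sp.symbols("u v", positive=True)
f_uv = u / v**2 + 1                     # joint density of (U, V) found above
f_u = sp.integrate(f_uv, (v, u, 1))     # integrate out v over u < v < 1
print(sp.simplify(f_u))                 # 2 - 2*u, i.e. 2(1 - u)
```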

Summary: fUV(u, v) = u/v² + 1 for 0 ≤ u ≤ v ≤ 1, and fU(u) = 2(1 - u), 0 < u < 1.
TUTORIAL QUESTIONS
1. The joint p.d.f. of the random variables X and Y is given by f(x, y) = 3(x + y), 0 < x < 1, 0 < y < 1, x + y < 1, and 0 otherwise. Find (i) the marginal p.d.f.s of X and Y and (ii) Cov(X, Y).
2. Obtain the correlation coefficient for the following data:
X: ... Y: ... (data as given)
3. The two lines of regression are 8x - 10y + 66 = 0 and 40x - 18y - 214 = 0. The variance of x is 9. Find (i) the mean values of x and y, and (ii) the correlation coefficient between x and y.
4. If X1, X2, ..., Xn are Poisson variates with parameter λ = 2, use the central limit theorem to find P(120 ≤ Sn ≤ 160), where Sn = X1 + X2 + ... + Xn and n = 75.
5. If the joint probability density function of a two dimensional random variable (X, Y) is given by f(x, y) = x² + xy/3, 0 < x < 1, 0 < y < 2, and 0 elsewhere, find (i) P(X > 1/2), (ii) P(Y < X) and (iii) P(Y < 1/2 / X < 1/2).
6. Two random variables X and Y have a given joint density; find Cov(X, Y).
7. If the equations of the two lines of regression of y on x and x on y are respectively 7x - 16y + 9 = 0 and 5y - 4x - 3 = 0, calculate the coefficient of correlation.
WORKED OUT EXAMPLES
Example 1
The joint p.d.f. of the random variables X and Y is given by f(x, y) = 8xy, 0 < x < 1, 0 < y < x, and 0 otherwise. Find (i) fX(x), (ii) fY(y), (iii) f(y/x).
Solution
(i) The marginal p.d.f. of X is fX(x) = ∫[0 to x] 8xy dy = 4x³, 0 < x < 1
(ii) The marginal p.d.f. of Y is fY(y) = ∫[y to 1] 8xy dx = 4y(1 - y²), 0 < y < 1
(iii) f(y/x) = f(x, y)/fX(x) = 8xy/4x³ = 2y/x², 0 < y < x
Example 2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Find the correlation coefficient between X and Y.
Solution
E(X) = ∫[-1 to 1] x(1/2) dx = 0, E(XY) = E(X³) = ∫[-1 to 1] x³(1/2) dx = 0
Cov(X, Y) = E(XY) - E(X) E(Y) = 0, so r(X, Y) = ρ(X, Y) = 0.
Note: Since E(X) = 0 and E(XY) = 0, we need not find σX and σY.

UNIT - III RANDOM PROCESSES
Introduction
In Unit I we discussed random variables. A random variable is a function of the possible outcomes of an experiment, but it does not include the concept of time. In real situations we come across many time-varying functions which are random in nature. In electrical and electronics engineering we study signals. Generally, signals are classified into two types: (i) deterministic and (ii) random.
Both deterministic and random signals are functions of time. For a deterministic signal it is possible to determine its value at any given time, but this is not possible for a random signal, since some element of uncertainty is always associated with it. The probability model used for characterizing a random signal is called a random process or stochastic process.
3.1 RANDOM PROCESS CONCEPT
A random process is a collection (ensemble) of real variables {X(s, t)} that are functions of a real variable t, where s ∈ S (S is the sample space) and t ∈ T (T is an index set).
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If s and t are fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION
Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is assumed to be continuous. A process with a discrete index set is denoted by {X(n)} or {Xn}.
A comparison between a random variable and a random process:
Random variable: a function of the possible outcomes of an experiment, X(s); an outcome is mapped into a number x.
Random process: a function of the possible outcomes of an experiment and also of time, i.e. X(s, t); outcomes are mapped into waveforms which are functions of time t.

3.2 CLASSIFICATION OF RANDOM PROCESSES
We can classify a random process according to the characteristics of the time t and the random variable X = X(t); t and x may each be discrete or continuous, with values in the ranges -∞ < t < ∞ and -∞ < x < ∞.
3.2.1 CONTINUOUS RANDOM PROCESS
If S is continuous and t takes any value, then X(t) is a continuous random process.
Example: Let X(t) = maximum temperature of a particular place in (0, t). Here S is a continuous set and t takes all values, so {X(t)} is a continuous random process.
3.2.2 DISCRETE RANDOM PROCESS
If S assumes only discrete values and t is continuous, then we call such a random process {X(t)} a discrete random process.
Example: Let X(t) be the number of telephone calls received in the interval (0, t). Here S = {0, 1, 2, 3, ...} and T = {t : t ≥ 0}, so {X(t)} is a discrete random process.
3.2.3 CONTINUOUS RANDOM SEQUENCE
If S is continuous but the time t takes only discrete values, the process is called a continuous random sequence.
3.2.4 DISCRETE RANDOM SEQUENCE
If both S and T are discrete, the process is called a discrete random sequence.
Example: Let Xn denote the outcome of the n-th toss of a fair die. Here S = {1, 2, 3, 4, 5, 6} and T = {1, 2, 3, ...}, so {Xn, n = 1, 2, 3, ...} is a discrete random sequence.
CLASSIFICATION OF RANDOM PROCESSES BASED ON SAMPLE FUNCTIONS
Non-Deterministic Process: a process is called non-deterministic if the future values of any sample function cannot be predicted exactly from observed values.
Deterministic Process: a process is called deterministic if future values of any sample function can be predicted from past values.
3.3 STATIONARY PROCESS
A random process is said to be stationary if its mean, variance, moments etc. are constant. Other processes are called non-stationary.
1st order distribution function of {X(t)}: for a specific t, X(t) is a random variable, as observed earlier. F(x, t) = P{X(t) ≤ x} is called the first order distribution of the process {X(t)}.
1st order density function of {X(t)}: f(x, t) = ∂F(x, t)/∂x is called the first order density of {X(t)}.
2nd order distribution function of {X(t)}: F(x1, x2; t1, t2) = P{X(t1) ≤ x1; X(t2) ≤ x2} is the joint distribution of the random variables X(t1) and X(t2), and is called the second order distribution of the process {X(t)}.
2nd order density function of {X(t)}: f(x1, x2; t1, t2) = ∂²F(x1, x2; t1, t2)/(∂x1 ∂x2) is called the second order density of {X(t)}.
3.3.1 First Order Stationary Process
Definition: A random process is called stationary to order one, or first order stationary, if its first order density function does not change with a shift in the time origin. In other words,
fX(x, t1) = fX(x, t1 + C)
must be true for every t1 and any real number C if {X(t1)} is to be a first order stationary process.
Example 3.3.1
Show that a first order stationary process has a constant mean.
Solution
Let us consider the random process {X(t)} at two different times t1 and t2.

E[X(t1)] = ∫ x f(x, t1) dx [f(x, t1) is the density of the random process at time t1]
E[X(t2)] = ∫ x f(x, t2) dx [f(x, t2) is the density of the random process at time t2]
Let t2 = t1 + C. Then
E[X(t2)] = ∫ x f(x, t1 + C) dx = ∫ x f(x, t1) dx = E[X(t1)] [by first order stationarity]
Thus E[X(t1)] = E[X(t2)], i.e. the mean of the process at t1 equals the mean of the process at t2.
Definition: If the process is first order stationary, then Mean = E[X(t)] = constant.
3.3.2 Second Order Stationary Process
A random process is said to be second order stationary if its second order density function is stationary, i.e.
f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C) for all x1, x2 and C.
Then E(X1²), E(X2²), E(X1X2) do not change with time, where X1 = X(t1), X2 = X(t2).
3.3.3 Strongly Stationary Process
A random process is called a strongly stationary process or strict sense stationary process (SSS process) if all its finite dimensional distributions are invariant under translation of the time t:
fX(x1, x2; t1, t2) = fX(x1, x2; t1 + C, t2 + C)
fX(x1, x2, x3; t1, t2, t3) = fX(x1, x2, x3; t1 + C, t2 + C, t3 + C)
and in general
fX(x1, x2, ..., xn; t1, t2, ..., tn) = fX(x1, x2, ..., xn; t1 + C, t2 + C, ..., tn + C)
for any t and any real number C.
Jointly Stationary in the Strict Sense
{X(t)} and {Y(t)} are said to be jointly stationary in the strict sense if the joint distributions of X(t) and Y(t) are invariant under translation of time.
Definition
Mean: µX(t) = E[X(t)], -∞ < t < ∞. µX(t) is also called the mean function or ensemble average of the random process.
Auto Correlation of a Random Process
Let X(t1) and X(t2) be the two given members of the random process {X(t)}. The auto correlation is
RXX(t1, t2) = E{X(t1) X(t2)}
Mean Square Value
Putting t1 = t2 = t, we get RXX(t, t) = E[X(t) X(t)]
RXX(t, t) = E[X²(t)] is the mean square value of the random process.
Auto Covariance of a Random Process
CXX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]} = RXX(t1, t2) - E[X(t1)] E[X(t2)]
Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρXX(t1, t2) = CXX(t1, t2)/√(Var X(t1) × Var X(t2))
where CXX(t1, t2) denotes the auto covariance.
3.4 CROSS CORRELATION
The cross correlation of the two random processes {X(t)} and {Y(t)} is defined by
RXY(t1, t2) = E[X(t1) Y(t2)]
3.5 WIDE-SENSE STATIONARY (WSS)
A random process {X(t)} is called a weakly stationary process or covariance stationary process or wide-sense stationary process if
i) E{X(t)} = constant
ii) E[X(t) X(t + τ)] = RXX(τ) depends only on τ, where τ = t2 - t1.
REMARK: An SSS process of order two is a WSS process, but not conversely.
3.6 EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.
SOLVED PROBLEMS ON WIDE SENSE STATIONARY PROCESSES
Example 3.6.1
Give an example of a stationary random process and justify your claim.
Solution
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a random variable uniformly distributed in the interval (0, 2π). Since θ is uniformly distributed in (0, 2π), we have

f(θ) = 1/(2π), 0 < θ < 2π; f(θ) = 0 otherwise.
E[X(t)] = ∫ X(t) f(θ) dθ = ∫[0 to 2π] A cos(ωt + θ) (1/(2π)) dθ
= (A/2π) [sin(ωt + θ)] from 0 to 2π
= (A/2π) [sin(ωt + 2π) - sin(ωt)]
= (A/2π) [sin ωt - sin ωt] = 0, a constant.
Since E[X(t)] is a constant, the process {X(t)} satisfies the mean condition for a stationary random process.
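A quick numerical illustration of Example 3.6.1 (an added sketch, with A and ω chosen arbitrarily): averaging X(t) = A cos(ωt + θ) over many random draws of θ gives approximately zero at every t.

```python
import numpy as np

rng = np.random.default_rng(0)
A, w = 2.0, 3.0                          # arbitrary constants for the sketch
theta = rng.uniform(0, 2 * np.pi, 100_000)

for t in (0.0, 0.7, 2.5):
    ensemble_mean = np.mean(A * np.cos(w * t + theta))
    print(t, round(ensemble_mean, 3))    # ≈ 0 for every t
```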

Example 3.6.2 (a process which is not stationary)
Examine whether the Poisson process {X(t)}, given by the probability law
P{X(t) = n} = e^(-λt) (λt)^n / n!, n = 0, 1, 2, ...
is stationary.
Solution
We know that the mean is given by
E[X(t)] = Σ(n = 0 to ∞) n Pn(t)
= Σ(n = 1 to ∞) n e^(-λt) (λt)^n / n!
= e^(-λt) Σ(n = 1 to ∞) (λt)^n / (n - 1)!
= λt e^(-λt) [1 + λt + (λt)²/2! + ...]
= λt e^(-λt) e^(λt) = λt, which depends on t.
Hence the Poisson process is not a stationary process.
3.7 ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as
X̄T = (1/2T) ∫[-T to T] X(t) dt
Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random variable X at time t: Ensemble average = E[X(t)].
Mean Ergodic Random Process
{X(t)} is said to be mean ergodic if
lim (T→∞) X̄T = µ, i.e. lim (T→∞) (1/2T) ∫[-T to T] X(t) dt = µ
Mean Ergodic Theorem
Let {X(t)} be a random process with constant mean µ and let X̄T be its time average. Then {X(t)} is mean ergodic if
lim (T→∞) Var(X̄T) = 0
Correlation Ergodic Process
The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is mean ergodic, where Y(t) = X(t) X(t + λ), i.e.
E[Y(t)] = lim (T→∞) ȲT, where ȲT is the time average of Y(t).

3.8 MARKOV PROCESS
Definition
A random process {X(t)} is said to be Markovian if
P[X(tn+1) ≤ xn+1 / X(tn) = xn, X(tn-1) = xn-1, ..., X(t0) = x0] = P[X(tn+1) ≤ xn+1 / X(tn) = xn]
where t0 ≤ t1 ≤ t2 ≤ ... ≤ tn ≤ tn+1.
Examples of Markov processes:
1. The probability of raining today depends only on the previous weather conditions that existed for the last two days and not on past weather conditions.
2. A difference equation is Markovian.
Classification of Markov Processes
1. Continuous parameter Markov process
2. Discrete parameter Markov process
3. Continuous parameter Markov chain
4. Discrete parameter Markov chain
3.9 MARKOV CHAIN
Definition
We define the Markov chain as follows: if
P{Xn = an / Xn-1 = an-1, Xn-2 = an-2, ..., X0 = a0} = P{Xn = an / Xn-1 = an-1}
for all n, the process {Xn}, n = 0, 1, ... is called a Markov chain.
1. a1, a2, a3, ..., an, ... are called the states of the Markov chain.
2. The conditional probability P{Xn = aj / Xn-1 = ai} = Pij(n-1, n) is called the one step transition probability from state ai to state aj at the n-th step.
3. The t.p.m. (transition probability matrix) of a Markov chain is a stochastic matrix:
i) Pij ≥ 0, ii) Σj Pij = 1 [the sum of the elements of any row is 1]
3.10 POISSON PROCESS
The Poisson process is a continuous parameter discrete state process, which is a very useful model for many practical situations. It describes the number of times a specified event occurs when an experiment is conducted as a function of time.
Probability law for the Poisson process
Let λ be the rate of occurrence (the number of occurrences per unit time) and let Pn(t) be the probability of n occurrences of the event in the interval (0, t). Then the count follows a Poisson distribution with parameter λt:
P[X(t) = n] = Pn(t) = e^(-λt) (λt)^n / n!, n = 0, 1, 2, ...
Second Order Probability Function of a Homogeneous Poisson Process
P[X(t1) = n1, X(t2) = n2] = P[X(t1) = n1] P[X(t2) = n2 / X(t1) = n1], t2 > t1
= P[X(t1) = n1] P[the event occurs n2 - n1 times in the interval (t2 - t1)]
= [e^(-λt1) (λt1)^n1 / n1!] [e^(-λ(t2 - t1)) (λ(t2 - t1))^(n2 - n1) / (n2 - n1)!], n2 ≥ n1
= e^(-λt2) λ^n2 t1^n1 (t2 - t1)^(n2 - n1) / [n1! (n2 - n1)!] for n2 ≥ n1, and 0 otherwise.
3.11 SEMI-RANDOM TELEGRAPH SIGNAL PROCESS
If N(t) represents the number of occurrences of a specified event in (0, t) and X(t) = (-1)^N(t), then {X(t)} is called a semi-random telegraph signal process.
3.12 RANDOM TELEGRAPH SIGNAL PROCESS
Definition
A random telegraph process is a discrete random process X(t) satisfying the following:
i. X(t) assumes only one of the two possible values 1 or -1 at any time t.
ii. X(0) = 1 or -1 with equal probability 1/2.
iii. The number of occurrences N(t) of transitions from one value to the other occurring in any interval of length t is a Poisson process with rate λ, so that the probability of exactly r transitions is
P[N(t) = r] = e^(-λt) (λt)^r / r!, r = 0, 1, 2, ...
A typical sample function of the telegraph process alternates between the two levels (0,1) and (0,-1).
Note: The process is an example of a discrete random process.
Mean and Auto Correlation: P{X(t) = 1} = 1/2 and P{X(t) = -1} = 1/2 for any t.
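A small simulation of the Poisson process count N(t) (an added sketch, with an arbitrary rate and horizon), confirming that mean and variance both equal λt, so the mean depends on t and the process cannot be stationary:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, runs = 2.0, 3.0, 200_000        # arbitrary rate and time horizon

N = rng.poisson(lam * t, size=runs)     # N(t) of a Poisson process
print(N.mean(), N.var())                # both ≈ λt = 6.0
```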

TUTORIAL QUESTIONS
1. The t.p.m. of a Markov chain with three states 1, 2, 3 is P (matrix as given) and the initial state distribution is P(X0 = i), i = 1, 2, 3. Find (i) P[X2 = 3] and (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].
2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B always throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is Markovian. Find the transition matrix and classify the states.
3. A housewife buys 3 kinds of cereals A, B, C. She never buys the same cereal in successive weeks. If she buys cereal A, the next week she buys cereal B. However, if she buys B or C, the next week she is 3 times as likely to buy A as the other cereal. How often does she buy each of the cereals?
4. A man either drives a car or catches a train to go to office each day. He never goes 2 days in a row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Now suppose that on the first day of the week the man tossed a fair die and drove to work if and only if a 6 appeared. Find (i) the probability that he takes a train on the 3rd day and (ii) the probability that he drives to work in the long run.
WORKED OUT EXAMPLES
Example 1
Let Xn denote the outcome of the n-th toss of a fair die. Here S = {1, 2, 3, 4, 5, 6} and T = {1, 2, 3, ...}, so {Xn, n = 1, 2, 3, ...} is a discrete random sequence.
Example 2
Give an example of a stationary random process and justify your claim.
Solution
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed in (0, 2π), so that f(θ) = 1/(2π), 0 < θ < 2π.
E[X(t)] = ∫[0 to 2π] A cos(ωt + θ)(1/(2π)) dθ = (A/2π)[sin(ωt + 2π) - sin(ωt)] = 0, a constant.
Since E[X(t)] is a constant, the process {X(t)} satisfies the mean condition for a stationary random process.
Example 3
Examine whether the Poisson process {X(t)}, given by the probability law P{X(t) = n} = e^(-λt)(λt)^n/n!, n = 0, 1, 2, ..., is stationary.
Solution
E[X(t)] = Σ n Pn(t) = λt e^(-λt) e^(λt) = λt, which depends on t.
Hence the Poisson process is not a stationary process.

UNIT 4: CORRELATION AND SPECTRAL DENSITY

Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is distributed over the frequency components into which x(t) may be decomposed. This distribution of the variance may be described either by a measure µ or by a statistical cumulative distribution function S(f), the power contributed by frequencies from 0 up to f. Given a band of frequencies [a, b), the amount of variance contributed to x(t) by frequencies lying within the interval [a, b) is given by S(b) - S(a). Then S is called the spectral distribution function of x. The spectral density at a frequency f gives the rate of variance contributed by frequencies in the immediate neighbourhood of f to the variance of x, per unit frequency.

4.1 AUTO CORRELATION OF A RANDOM PROCESS
Let X(t1) and X(t2) be the two given random variables. Then the auto correlation is

    R_XX(t1, t2) = E[X(t1) X(t2)]    ... (1)

Mean Square Value
Putting t1 = t2 = t in (1),

    R_XX(t, t) = E[X(t) X(t)] = E[X²(t)],

which is called the mean square value of the random process.

Auto Correlation Function
Definition: The auto correlation function of the random process {X(t)} is

    R_XX(τ) = E[X(t) X(t + τ)]

Note: R_XX(τ) is also written as R(τ) or R_X(τ).

PROPERTY 1: The mean square value of the random process may be obtained from the auto correlation function R_XX(τ) by putting τ = 0:

    R_XX(0) = E[X²(t)],

which is known as the average power of the random process {X(t)}.

PROPERTY 2: R_XX(τ) is an even function of τ:

    R_XX(τ) = R_XX(-τ)

PROPERTY 3: If the process {X(t)} contains a periodic component, then R_XX(τ) also contains a periodic component of the same period.

PROPERTY 4: If a random process {X(t)} has no periodic components, and E[X(t)] = X̄, then

    lim_{|τ|→∞} R_XX(τ) = X̄²   (or)   X̄ = sqrt( lim_{|τ|→∞} R_XX(τ) )

i.e., when τ → ∞, the auto correlation function gives the square of the mean of the random process.

PROPERTY 5: The auto correlation function of a random process cannot have an arbitrary shape.

SOLVED PROBLEMS ON AUTO CORRELATION

Example 1: Check whether the following functions are valid auto correlation functions:
(i) 5 sin(nπτ)   (ii) 1/(1 + 9τ²)

Solution:
(i) Given R_XX(τ) = 5 sin(nπτ).

    R_XX(-τ) = 5 sin(-nπτ) = -5 sin(nπτ)

    R_XX(τ) ≠ R_XX(-τ), so the given function is not an auto correlation function.

(ii) Given R_XX(τ) = 1/(1 + 9τ²).

    R_XX(-τ) = 1/(1 + 9(-τ)²) = 1/(1 + 9τ²) = R_XX(τ)

    The given function is an auto correlation function.
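The evenness test in Example 1 is easy to automate. A small Python sketch (added for illustration; the integer n = 3 is a hypothetical choice, since any positive integer gives the same conclusion):

import numpy as np

tau = np.linspace(-5, 5, 1001)
n = 3                                   # hypothetical integer in 5 sin(n*pi*tau)

R1 = 5 * np.sin(n * np.pi * tau)        # odd function of tau
R2 = 1 / (1 + 9 * tau**2)               # even function of tau

# An ACF must satisfy R(tau) = R(-tau); reversing the symmetric grid tests this.
print("R1 even?", np.allclose(R1, R1[::-1]))   # False -> not a valid ACF
print("R2 even?", np.allclose(R2, R2[::-1]))   # True  -> passes the test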

Example 2: Find the mean and variance of a stationary random process whose auto correlation function is given by

    R_XX(τ) = 18 + 2/(6 + τ²)

Solution:
Given R_XX(τ) = 18 + 2/(6 + τ²).

    X̄² = lim_{|τ|→∞} R_XX(τ) = lim_{τ→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18

    X̄² = 18, so E[X(t)] = √18 = 3√2.

    Var{X(t)} = E[X²(t)] - {E[X(t)]}²

We know that E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3.

    Var{X(t)} = 55/3 - 18 = 1/3
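The mean and variance obtained in Example 2 can be confirmed directly from the two properties used there — R(0) for the power and the limit of R(τ) for the squared mean. A minimal Python check (illustrative only):

import numpy as np

R = lambda tau: 18 + 2 / (6 + tau**2)

mean_sq = R(1e9)            # lim_{tau -> inf} R(tau) = (mean)^2
power = R(0.0)              # E[X^2(t)] = R(0)

print("mean     =", np.sqrt(mean_sq))      # ~ sqrt(18) = 3*sqrt(2)
print("variance =", power - mean_sq)       # 55/3 - 18 = 1/3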

Example 3: Express the auto correlation function of the process {X'(t)} in terms of the auto correlation function of the process {X(t)}.

Solution:
Consider R_XX'(t1, t2) = E[X(t1) X'(t2)]

    = E[ X(t1) lim_{h→0} (X(t2 + h) - X(t2))/h ]
    = lim_{h→0} E[ (X(t1) X(t2 + h) - X(t1) X(t2))/h ]
    = lim_{h→0} [R_XX(t1, t2 + h) - R_XX(t1, t2)]/h

    R_XX'(t1, t2) = ∂/∂t2 R_XX(t1, t2)    ... (1)

Similarly,

    R_X'X'(t1, t2) = ∂/∂t1 R_XX'(t1, t2) = ∂²/(∂t1 ∂t2) R_XX(t1, t2), by (1)

Auto Covariance
The auto covariance of the process {X(t)}, denoted by C_XX(t1, t2) or C(t1, t2), is defined as

    C_XX(t1, t2) = E[ {X(t1) - E[X(t1)]} {X(t2) - E[X(t2)]} ]

4.2 CORRELATION COEFFICIENT

    ρ_XX(t1, t2) = C_XX(t1, t2) / sqrt( Var[X(t1)] . Var[X(t2)] ),

where C_XX(t1, t2) denotes the auto covariance.

4.3 CROSS CORRELATION
The cross correlation between the two random processes {X(t)} and {Y(t)} is defined as

    R_XY(t1, t2) = E[X(t1) Y(t2)],

where X(t1) and Y(t2) are random variables.

4.4 CROSS COVARIANCE
Let {X(t)} and {Y(t)} be any two random processes. Then the cross covariance is defined as

    C_XY(t1, t2) = E[ {X(t1) - E[X(t1)]} {Y(t2) - E[Y(t2)]} ]

The relation between the cross correlation and the cross covariance is as follows:

    C_XY(t1, t2) = R_XY(t1, t2) - E[X(t1)] E[Y(t2)]

Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if C_XY(t1, t2) = 0 for all t1, t2.
Hence, from the above remark, we have

    R_XY(t1, t2) = E[X(t1)] E[Y(t2)].

4.4.1 CROSS CORRELATION COEFFICIENT

    ρ_XY(t1, t2) = C_XY(t1, t2) / sqrt( Var[X(t1)] . Var[Y(t2)] )

4.4.2 CROSS CORRELATION AND ITS PROPERTIES
Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them is also defined as

    R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ)

PROPERTY 1: R_XY(τ) = R_YX(-τ)

PROPERTY 2: If {X(t)} and {Y(t)} are two random processes, then

    |R_XY(τ)| <= sqrt( R_XX(0) R_YY(0) ),

where R_XX(τ) and R_YY(τ) are their respective auto correlation functions.

PROPERTY 3: If {X(t)} and {Y(t)} are two random processes, then

    |R_XY(τ)| <= (1/2) [R_XX(0) + R_YY(0)]

SOLVED PROBLEMS ON CROSS CORRELATION

Example 4.4.1: Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross correlation function.

Solution:
By definition we have R_XY(τ) = R_XY(t, t + τ).
Now,

    R_XY(t, t + τ) = E[X(t) . Y(t + τ)]
                   = E[A cos(ωt + θ) . A sin(ω(t + τ) + θ)]
                   = A² E[ sin(ω(t + τ) + θ) cos(ωt + θ) ]    ... (1)

Since θ is a uniformly distributed random variable, we have f(θ) = 1/(2π), 0 <= θ <= 2π.
Now,

    E[ sin{ω(t + τ) + θ} cos(ωt + θ) ]
        = ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) f(θ) dθ
        = ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) (1/2π) dθ
        = (1/2π) ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) dθ
        = (1/4π) ∫_0^{2π} { sin(ωt + ωτ + θ + ωt + θ) + sin(ωt + ωτ + θ - ωt - θ) } dθ
        = (1/4π) ∫_0^{2π} { sin(2ωt + ωτ + 2θ) + sin ωτ } dθ

        = (1/4π) [ -cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ ]_0^{2π}
        = (1/4π) [ -cos(2ωt + ωτ + 4π)/2 + cos(2ωt + ωτ)/2 + 2π sin ωτ ]
        = (1/4π) [ -cos(2ωt + ωτ)/2 + cos(2ωt + ωτ)/2 + 2π sin ωτ ]
        = (1/4π) . 2π sin ωτ
        = (1/2) sin ωτ    ... (3)

Substituting (3) in (1), we get

    R_XY(t, t + τ) = (A²/2) sin ωτ
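A Monte Carlo check of this closed form (a minimal Python sketch, not part of the original notes; A, ω, t and τ are arbitrary test values):

import numpy as np

rng = np.random.default_rng(3)
A, w, t, tau, trials = 2.0, 1.0, 0.7, 0.4, 1_000_000
theta = rng.uniform(0.0, 2 * np.pi, size=trials)

x = A * np.cos(w * t + theta)
y = A * np.sin(w * (t + tau) + theta)

print("Monte Carlo estimate :", (x * y).mean())
print("(A^2/2) sin(w tau)   :", A**2 / 2 * np.sin(w * tau))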

4.5 SPECTRAL DENSITIES (POWER SPECTRAL DENSITY)

INTRODUCTION
Prerequisites:
(i) Fourier transform
(ii) Inverse Fourier transform
(iii) Properties of the auto correlation function
(iv) Basic trigonometric formulae
(v) Basic integration

4.5.1 SPECTRAL REPRESENTATION
Let x(t) be a deterministic signal. The Fourier transform of x(t) is defined as

    X(ω) = F[x(t)] = ∫_{-∞}^{∞} x(t) e^{-iωt} dt

Here X(ω) is called the "spectrum of x(t)". Hence

    x(t) = inverse Fourier transform of X(ω) = (1/2π) ∫_{-∞}^{∞} X(ω) e^{iωt} dω.

Definition
The average power P(T) of x(t) over the interval (-T, T) is given by

    P(T) = (1/2T) ∫_{-T}^{T} x²(t) dt
         = (1/2π) ∫_{-∞}^{∞} (|X_T(ω)|² / 2T) dω    ... (1)

Definition
The average power P_XX of the random process {X(t)} is given by

    P_XX = lim_{T→∞} (1/2T) ∫_{-T}^{T} E[X²(t)] dt
         = (1/2π) ∫_{-∞}^{∞} lim_{T→∞} (E[|X_T(ω)|²] / 2T) dω    ... (2)

4.6 POWER SPECTRAL DENSITY FUNCTION
Definition
If {X(t)} is a stationary process (either in the strict sense or the wide sense) with auto correlation function R_XX(τ), then the Fourier transform of R_XX(τ) is called the power spectral density function of {X(t)} and is denoted by S_XX(ω) or S(ω) or S_X(ω).

S_XX(ω) = Fourier transform of R_XX(τ):

    S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ

In terms of the frequency f:

    S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ

4.6.1 WIENER-KHINCHINE RELATION

    S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
    S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ

To find R_XX(τ) if S_XX(ω) or S_XX(f) is given:

    R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(ω) e^{iωτ} dω   [inverse Fourier transform of S_XX(ω)]
    (or) R_XX(τ) = ∫_{-∞}^{∞} S_XX(f) e^{i2πfτ} df   [inverse Fourier transform of S_XX(f)]
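The forward relation can be evaluated numerically as a Riemann sum. The Python sketch below (illustrative, not part of the original material) uses the ACF e^{-2λ|τ|} of the random telegraph signal — which reappears in the solved problems of Unit 5 — and compares the result with the closed form 4λ/(4λ² + ω²) derived there:

import numpy as np

lam = 2.0
dtau = 1e-3
tau = np.arange(-40.0, 40.0, dtau)
R = np.exp(-2 * lam * np.abs(tau))          # ACF of the telegraph process

for w in (0.0, 1.0, 5.0):
    S = (R * np.exp(-1j * w * tau)).sum().real * dtau   # Riemann sum
    print(w, round(S, 5), round(4 * lam / (4 * lam**2 + w**2), 5))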

4.7 PROPERTIES OF THE POWER SPECTRAL DENSITY FUNCTION

Property 1:
The value of the spectral density function at zero frequency is equal to the total area under the graph of the auto correlation function:

    S_XX(f) = ∫_{-∞}^{∞} R_XX(τ) e^{-i2πfτ} dτ

Taking f = 0, we get

    S_XX(0) = ∫_{-∞}^{∞} R_XX(τ) dτ

TUTORIAL QUESTIONS

1. Find the ACF of {Y(t)} = A X(t) cos(ω0 t + θ), where X(t) is a zero mean stationary random process with a given ACF, A and ω0 are constants, and θ is uniformly distributed over (0, 2π) and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) - X(t - a), prove that R_YY(τ) = 2R_XX(τ) - R_XX(τ + 2a) - R_XX(τ - 2a).
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly distributed over (-π, π), find the ACF of {Y(t)}, where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt - A sin ωt, where ω is a constant and A and B are independent random variables both having zero mean and variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly W.S.S. processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation of X(t) and Y(t) and verify that R_XY(τ) = R_YX(-τ).
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random variables such that E(X) = E(Y) = 0 and E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly W.S.S. but are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ) and Y(t) = B cos(ωt + θ), where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross correlation and show that X(t) and Y(t) are jointly W.S.S.

WORKED OUT EXAMPLES

Example 1: Check whether the following functions are valid auto correlation functions:
(i) 5 sin(nπτ)   (ii) 1/(1 + 9τ²)

Solution:
(i) Given R_XX(τ) = 5 sin(nπτ).

    R_XX(-τ) = 5 sin(-nπτ) = -5 sin(nπτ)

    R_XX(τ) ≠ R_XX(-τ), so the given function is not an auto correlation function.

(ii) Given R_XX(τ) = 1/(1 + 9τ²).

    R_XX(-τ) = 1/(1 + 9(-τ)²) = 1/(1 + 9τ²) = R_XX(τ)

    The given function is an auto correlation function.

Example 2: Find the mean and variance of a stationary random process whose auto correlation function is given by

    R_XX(τ) = 18 + 2/(6 + τ²)

Solution:
Given R_XX(τ) = 18 + 2/(6 + τ²).

    X̄² = lim_{|τ|→∞} R_XX(τ) = lim_{τ→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18

    X̄² = 18, so E[X(t)] = √18 = 3√2.

    Var{X(t)} = E[X²(t)] - {E[X(t)]}²

We know that E[X²(t)] = R_XX(0) = 18 + 2/6 = 55/3.

    Var{X(t)} = 55/3 - 18 = 1/3

Example 3:

Express the auto correlation function of the process {X'(t)} in terms of the auto correlation function of the process {X(t)}.

Solution:
Consider R_XX'(t1, t2) = E[X(t1) X'(t2)]

    = E[ X(t1) lim_{h→0} (X(t2 + h) - X(t2))/h ]
    = lim_{h→0} E[ (X(t1) X(t2 + h) - X(t1) X(t2))/h ]
    = lim_{h→0} [R_XX(t1, t2 + h) - R_XX(t1, t2)]/h

    R_XX'(t1, t2) = ∂/∂t2 R_XX(t1, t2)    ... (1)

Similarly,

    R_X'X'(t1, t2) = ∂/∂t1 R_XX'(t1, t2) = ∂²/(∂t1 ∂t2) R_XX(t1, t2), by (1)

Example 4: Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over 0 to 2π. Find the cross correlation function.

Solution:
By definition we have R_XY(τ) = R_XY(t, t + τ).
Now,

    R_XY(t, t + τ) = E[X(t) . Y(t + τ)]
                   = E[A cos(ωt + θ) . A sin(ω(t + τ) + θ)]
                   = A² E[ sin(ω(t + τ) + θ) cos(ωt + θ) ]    ... (1)

Since θ is a uniformly distributed random variable, we have f(θ) = 1/(2π), 0 <= θ <= 2π.
Now,

    E[ sin{ω(t + τ) + θ} cos(ωt + θ) ]
        = ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) f(θ) dθ

        = ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) (1/2π) dθ
        = (1/2π) ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ) dθ
        = (1/4π) ∫_0^{2π} { sin(2ωt + ωτ + 2θ) + sin ωτ } dθ
        = (1/4π) [ -cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ ]_0^{2π}
        = (1/4π) [ -cos(2ωt + ωτ + 4π)/2 + cos(2ωt + ωτ)/2 + 2π sin ωτ ]
        = (1/4π) [ -cos(2ωt + ωτ)/2 + cos(2ωt + ωτ)/2 + 2π sin ωτ ]
        = (1/4π) . 2π sin ωτ
        = (1/2) sin ωτ    ... (3)

Substituting (3) in (1), we get

    R_XY(t, t + τ) = (A²/2) sin ωτ

UNIT 5: LINEAR SYSTEMS WITH RANDOM INPUTS

Introduction
Mathematically, a "system" is a functional relationship between the input x(t) and the output y(t). We can write the relationship as

    y(t) = f[x(t) : -∞ < t < ∞]

Let x(t) represent a sample function of a random process {X(t)}. Suppose the system produces an output or response y(t), and the ensemble of the output functions forms a random process {Y(t)}. Then the process {Y(t)} can be considered as the output of the system or transformation f with {X(t)} as the input, and the system is completely specified by the operator f.

5.1 LINEAR TIME INVARIANT SYSTEM
Mathematically, a "system" is a functional relationship between the input x(t) and the output y(t):

    y(t) = f[x(t) : -∞ < t < ∞]

5.2 CLASSIFICATION OF SYSTEMS
1. Linear system: f is called a linear system if it satisfies

    f[a1 X1(t) ± a2 X2(t)] = a1 f[X1(t)] ± a2 f[X2(t)]

2. Time invariant system: Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time invariant system, or X(t) and Y(t) are said to form a time invariant system.

3. Causal system: Suppose the value of the output Y(t) at t = t0 depends only on the past values of the input X(t), t <= t0. In other words, if Y(t0) = f[X(t) : t <= t0], then such a system is called a causal system.

4. Memoryless system: If the output Y(t) at a given time t = t0 depends only on X(t0) and not on any other past or future values of X(t), then the system f is called a memoryless system.

5. Stable system: A linear time invariant system is said to be stable if its response to any bounded input is bounded.

REMARK:
i) Note that when we write X(t) we mean X(s, t), where s ∈ S, S being the sample space. If the system operates only on the variable t, treating s as a parameter, it is called a deterministic system.

(Block diagrams: (a) a general single input-output linear system mapping the input X(t) to the output Y(t); (b) a linear time invariant (LTI) system with impulse response h(t), again mapping X(t) to Y(t).)

5.3 REPRESENTATION OF THE SYSTEM IN THE FORM OF A CONVOLUTION

    Y(t) = h(t) * X(t)
         = ∫_{-∞}^{∞} h(u) X(t - u) du
         = ∫_{-∞}^{∞} h(t - u) X(u) du

5.4 UNIT IMPULSE RESPONSE OF THE SYSTEM
If the input of the system is the unit impulse function, then the output or response is the system weighting function:

    Y(t) = h(t),

which is the system weighting function.
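On a discrete time grid the convolution integral becomes a discrete convolution. A minimal Python sketch (illustrative only; the causal impulse response h(t) = e^{-2t} and the noise input are hypothetical choices):

import numpy as np

rng = np.random.default_rng(4)
dt = 0.01
t = np.arange(0.0, 5.0, dt)

h = np.exp(-2 * t)                    # hypothetical causal impulse response
x = rng.normal(size=t.size)           # one sample path of a noise input

# Y(t) = integral of h(u) X(t - u) du, approximated by a discrete
# convolution on the grid (the factor dt plays the role of du).
y = np.convolve(x, h)[: t.size] * dt
print(y[:5])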

5.4.1 PROPERTIES OF LINEAR SYSTEMS WITH RANDOM INPUT

Property 1: If the input X(t) and its output Y(t) are related by Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then the system is a linear time-invariant system.

Property 2: If the input to a time-invariant, stable linear system is a WSS process, then the output will also be a WSS process; i.e., if {X(t)} is a WSS process, then the output {Y(t)} is a WSS process.

Property 3: If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then

    R_XY(τ) = R_XX(τ) * h(τ)

Property 4: If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then

    R_YY(τ) = R_XY(τ) * h(-τ)

Property 5: If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then

    R_YY(τ) = R_XX(τ) * h(τ) * h(-τ)

Property 6: If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then

    S_XY(ω) = S_XX(ω) H(ω)

Property 7: If {X(t)} is a WSS process and if Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then

    S_YY(ω) = S_XX(ω) |H(ω)|²

Note: Instead of taking R_XY(τ) = E[X(t) Y(t + τ)] in properties (3), (4) and (5), if we start with R_XY(τ) = E[X(t + τ) Y(t)], then the above properties can also be stated as:
a) R_XY(τ) = R_XX(τ) * h(-τ)
b) R_YY(τ) = R_XY(τ) * h(τ)
c) R_YY(τ) = R_XX(τ) * h(τ) * h(-τ)

REMARK:
(i) We have written H(ω) H*(ω) = |H(ω)|² because H(ω) = F[h(τ)] and H*(ω) = F[h(-τ)].
(ii) Equation (c) gives a relationship between the spectral densities of the input and output processes of the system.
(iii) System transfer function: We call H(ω) = F{h(τ)} the power transfer function or system transfer function.

SOLVED PROBLEMS ON AUTO AND CROSS CORRELATION FUNCTIONS OF INPUT AND OUTPUT

Example 5.4.1: Find the power spectral density of the random telegraph signal.

Solution:
We know that the auto correlation of the telegraph signal process X(t) is

    R_XX(τ) = e^{-2λ|τ|}

The power spectral density is

    S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
            = ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ
              (since |τ| = -τ when τ < 0 and |τ| = τ when τ > 0)
            = ∫_{-∞}^{0} e^{(2λ - iω)τ} dτ + ∫_{0}^{∞} e^{-(2λ + iω)τ} dτ
            = [ e^{(2λ - iω)τ} / (2λ - iω) ]_{-∞}^{0} + [ e^{-(2λ + iω)τ} / -(2λ + iω) ]_{0}^{∞}
            = 1/(2λ - iω) + 1/(2λ + iω)
            = (2λ + iω + 2λ - iω) / [(2λ - iω)(2λ + iω)]

    S_XX(ω) = 4λ / (4λ² + ω²)

Example 5.4.2: A linear time invariant system has an impulse response h(t) = e^{-βt} U(t). Find the power spectral density of the output Y(t) corresponding to the input X(t).

Solution:
Given X(t) is the input and Y(t) is the output, with

    S_YY(ω) = |H(ω)|² S_XX(ω).

Now,

    H(ω) = ∫_{-∞}^{∞} h(t) e^{-iωt} dt
         = ∫_{0}^{∞} e^{-βt} e^{-iωt} dt
         = ∫_{0}^{∞} e^{-(β + iω)t} dt
         = [ e^{-(β + iω)t} / -(β + iω) ]_{0}^{∞}
         = 1 / (β + iω)

    |H(ω)|² = 1 / (β² + ω²)

    S_YY(ω) = S_XX(ω) / (β² + ω²)

(A numerical check of this transfer function is sketched after the tutorial questions below.)

TUTORIAL QUESTIONS

1. State and prove the power spectral density of system response theorem.
2. Suppose that X(t) is the input to an LTI system with impulse response h1(t) and that Y(t) is the input to another LTI system with impulse response h2(t). It is assumed that X(t) and Y(t) are jointly wide sense stationary. Let V(t) and Z(t) denote the random processes at the respective system outputs. Find the cross correlation of V(t) and Z(t).
3. The input to an RC filter is a white noise process with a given ACF. Given the frequency response of the filter, find the auto correlation and the mean square value of the output process Y(t).
4. A random process X(t) having ACF R_XX(τ) = P e^{-α|τ|}, where P and α are real positive constants, is applied to the input of the system with impulse response

    h(t) = λ e^{-λt},  t > 0
         = 0,          t < 0

where
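Referring back to Example 5.4.2, the integral defining H(ω) can be checked as a Riemann sum. A minimal Python sketch (illustrative only; β = 1.5 and ω = 2 are arbitrary test values):

import numpy as np

beta, w = 1.5, 2.0
dt = 1e-4
t = np.arange(0.0, 50.0, dt)

# H(w) = integral_0^inf e^{-beta t} e^{-i w t} dt, as a Riemann sum.
H = (np.exp(-beta * t) * np.exp(-1j * w * t)).sum() * dt

print("numeric H(w)  :", H)
print("1/(beta + iw) :", 1 / (beta + 1j * w))
print("|H|^2         :", abs(H) ** 2, "vs", 1 / (beta**2 + w**2))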

λ is a real positive constant. Find the ACF of the network's response Y(t). Also find the cross correlation.

WORKED OUT EXAMPLES

Example 1: Find the power spectral density of the random telegraph signal.

Solution:
We know that the auto correlation of the telegraph signal process X(t) is

    R_XX(τ) = e^{-2λ|τ|}

The power spectral density is

    S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ
            = ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ
              (since |τ| = -τ when τ < 0 and |τ| = τ when τ > 0)
            = ∫_{-∞}^{0} e^{(2λ - iω)τ} dτ + ∫_{0}^{∞} e^{-(2λ + iω)τ} dτ
            = [ e^{(2λ - iω)τ} / (2λ - iω) ]_{-∞}^{0} + [ e^{-(2λ + iω)τ} / -(2λ + iω) ]_{0}^{∞}
            = 1/(2λ - iω) + 1/(2λ + iω)
            = (2λ + iω + 2λ - iω) / [(2λ - iω)(2λ + iω)]

    S_XX(ω) = 4λ / (4λ² + ω²)

PROBABILITY & RANDOM PROCESS FORMULAS

UNIT I (RANDOM VARIABLES)

1) Discrete random variable: A random variable whose set of possible values is either finite or countably infinite is called a discrete random variable.
Eg: (i) Let X represent the sum of the numbers on the dice when two dice are thrown. In this case the random variable X takes the values 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 and 12, so X is a discrete random variable. (ii) The number of transmitted bits received in error.

2) Continuous random variable: A random variable X is said to be continuous if it takes all possible values between certain limits.
Eg: The length of time during which a vacuum tube installed in a circuit functions is a continuous random variable; other examples include the proportion of defective parts among those tested.

3) Comparison of discrete and continuous random variables:

Sl.No.  Discrete random variable                     Continuous random variable
1       Σ_i p(x_i) = 1                               ∫_{-∞}^{∞} f(x) dx = 1
2       F(x) = P[X <= x] = Σ_{x_i <= x} p(x_i)       F(x) = P[X <= x] = ∫_{-∞}^{x} f(x) dx
3       Mean = E[X] = Σ_i x_i p(x_i)                 Mean = E[X] = ∫_{-∞}^{∞} x f(x) dx
4       E[X²] = Σ_i x_i² p(x_i)                      E[X²] = ∫_{-∞}^{∞} x² f(x) dx
5       Var(X) = E(X²) - [E(X)]²                     Var(X) = E(X²) - [E(X)]²
6       rth moment = E[X^r] = Σ_i x_i^r p(x_i)       rth moment = E[X^r] = ∫_{-∞}^{∞} x^r f(x) dx
7       M.G.F: M_X(t) = E[e^{tX}] = Σ_x e^{tx} p(x)  M.G.F: M_X(t) = E[e^{tX}] = ∫_{-∞}^{∞} e^{tx} f(x) dx
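Both columns of the table can be exercised numerically. A minimal Python sketch (added for illustration; the fair die and the exponential density are arbitrary test cases):

import numpy as np

# Discrete case: mean and variance of a fair-die outcome from its pmf.
x = np.arange(1, 7)
p = np.full(6, 1 / 6)
mean_d = (x * p).sum()
var_d = (x**2 * p).sum() - mean_d**2
print(mean_d, var_d)                  # 3.5 and 35/12

# Continuous case: exponential density f(x) = lam e^{-lam x}, x > 0,
# integrated numerically on a grid.
lam, dx = 2.0, 1e-4
xs = np.arange(0.0, 50.0, dx)
f = lam * np.exp(-lam * xs)
mean_c = (xs * f).sum() * dx
var_c = (xs**2 * f).sum() * dx - mean_c**2
print(mean_c, var_c)                  # ~ 1/lam and 1/lam^2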

4) E(aX + b) = a E(X) + b
5) Var(aX + b) = a² Var(X)
6) Var(aX ± bY) = a² Var(X) + b² Var(Y)
7) Standard deviation = sqrt(Var(X))
8) f(x) = F'(x)
9) P(X > a) = 1 - P(X <= a)
10) P(A / B) = P(A ∩ B) / P(B), P(B) ≠ 0
11) If A and B are independent, then P(A ∩ B) = P(A) P(B).
12) 1st moment about the origin: E[X] = [d/dt M_X(t)] at t = 0 (the mean).
    2nd moment about the origin: E[X²] = [d²/dt² M_X(t)] at t = 0.
    The coefficient of t^r / r! in M_X(t) gives E[X^r], the rth moment about the origin.
13) Limitations of the M.G.F:
    i) A random variable X may have no moments although its m.g.f exists.
    ii) A random variable X can have an m.g.f and some or all moments, yet the m.g.f does not generate the moments.
    iii) A random variable X can have all or some moments, but the m.g.f does not exist except perhaps at one point.
14) Properties of the M.G.F:
    i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
    ii) M_{cX}(t) = M_X(ct), where c is a constant.
    iii) If X and Y are two independent random variables, then M_{X+Y}(t) = M_X(t) M_Y(t).
15) P.D.F, M.G.F, mean and variance of the standard distributions:

Sl.No.  Distribution   P(X = x) / f(x)                              M.G.F                        Mean       Variance
1       Binomial       nCx p^x q^{n-x}                              (q + p e^t)^n                np         npq
2       Poisson        e^{-λ} λ^x / x!                              e^{λ(e^t - 1)}               λ          λ
3       Geometric      q^{x-1} p                                    p e^t / (1 - q e^t)          1/p        q/p²
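Item 12 can be verified by differentiating an M.G.F numerically at t = 0. A minimal Python sketch (illustrative; the binomial parameters n = 10, p = 0.3 are arbitrary):

import numpy as np

n, p = 10, 0.3
q = 1 - p
M = lambda t: (q + p * np.exp(t)) ** n      # binomial M.G.F from the table

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)               # M'(0)  = E[X]
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2     # M''(0) = E[X^2]

print(m1, "vs np  =", n * p)
print(m2 - m1**2, "vs npq =", n * p * q)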

4       Uniform        f(x) = 1/(b - a), a < x < b; 0 otherwise     (e^{bt} - e^{at}) / ((b-a)t)  (a+b)/2    (b-a)²/12
5       Exponential    f(x) = λ e^{-λx}, x > 0, λ > 0; 0 otherwise  λ / (λ - t)                   1/λ        1/λ²
6       Gamma          f(x) = e^{-x} x^{λ-1} / Γ(λ), 0 < x < ∞      (1 - t)^{-λ}                  λ          λ
7       Normal         f(x) = (1/(σ√(2π))) e^{-(1/2)((x-µ)/σ)²}     e^{µt + σ²t²/2}               µ          σ²

16) Memoryless property of the exponential distribution:

    P(X > S + t / X > S) = P(X > t)

17) Function of a random variable: f_Y(y) = f_X(x) |dx/dy|

UNIT II (TWO DIMENSIONAL RANDOM VARIABLES)

1) Σ_i Σ_j p_ij = 1 (discrete random variables)
   ∫∫ f(x, y) dx dy = 1 (continuous random variables)

2) Conditional probability function of X given Y: P{X = x_i / Y = y_j} = P(x_i, y_j) / P(y_j).
   Conditional probability function of Y given X: P{Y = y_j / X = x_i} = P(x_i, y_j) / P(x_i).

   P{X < a / Y < b} = P(X < a, Y < b) / P(Y < b)

3) Conditional density function of X given Y: f(x / y) = f(x, y) / f(y).
   Conditional density function of Y given X: f(y / x) = f(x, y) / f(x).

4) If X and Y are independent random variables, then
   f(x, y) = f(x) . f(y) (for continuous random variables)

   P(X = x, Y = y) = P(X = x) . P(Y = y) (for discrete random variables)

5) Joint probability density function:
   P(a <= X <= b, c <= Y <= d) = ∫_c^d ∫_a^b f(x, y) dx dy
   P(X < a, Y < b) = ∫_{-∞}^{b} ∫_{-∞}^{a} f(x, y) dx dy

6) Marginal density function of X: f_X(x) = f(x) = ∫_{-∞}^{∞} f(x, y) dy
   Marginal density function of Y: f_Y(y) = f(y) = ∫_{-∞}^{∞} f(x, y) dx

7) P(X + Y >= 1) = 1 - P(X + Y < 1)

8) Correlation coefficient (discrete):
   ρ(x, y) = Cov(X, Y) / (σ_X σ_Y),
   Cov(X, Y) = (1/n) Σ XY - X̄ Ȳ,   σ_X = sqrt((1/n) Σ X² - X̄²),   σ_Y = sqrt((1/n) Σ Y² - Ȳ²)

9) Correlation coefficient (continuous):
   ρ(x, y) = Cov(X, Y) / (σ_X σ_Y),
   Cov(X, Y) = E(X, Y) - E(X) E(Y),   σ_X = sqrt(Var(X)),   σ_Y = sqrt(Var(Y))

10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.

11) E(X) = ∫ x f(x) dx,  E(Y) = ∫ y f(y) dy,  E(X, Y) = ∫∫ xy f(x, y) dx dy

12) Regression for discrete random variables:
    Regression line X on Y: x - x̄ = b_xy (y - ȳ),  b_xy = Σ(x - x̄)(y - ȳ) / Σ(y - ȳ)²
    Regression line Y on X: y - ȳ = b_yx (x - x̄),  b_yx = Σ(x - x̄)(y - ȳ) / Σ(x - x̄)²
    Correlation through the regression coefficients: ρ = ± sqrt(b_xy . b_yx)
    Note: ρ(x, y) = r(x, y).
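The covariance and correlation-coefficient formulas in items 8-10 can be exercised on simulated data. A minimal Python sketch (illustrative only; the linear dependence y = 2x + noise is a hypothetical construction):

import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=100_000)
y = 2 * x + rng.normal(size=100_000)      # built to be correlated with x

cov = (x * y).mean() - x.mean() * y.mean()
rho = cov / (x.std() * y.std())
print(rho, np.corrcoef(x, y)[0, 1])       # the two estimates agree (~0.894)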

13) Regression for continuous random variables:
    Regression line X on Y: x - E(x) = b_xy (y - E(y)),  b_xy = r σ_x / σ_y
    Regression line Y on X: y - E(y) = b_yx (x - E(x)),  b_yx = r σ_y / σ_x
    Regression curve X on Y: x = E(x / y) = ∫ x f(x / y) dx
    Regression curve Y on X: y = E(y / x) = ∫ y f(y / x) dy

14) Transformation of random variables:
    f_Y(y) = f_X(x) |dx/dy| (one dimensional random variable)
    f_UV(u, v) = f_XY(x, y) |∂(x, y)/∂(u, v)| (two dimensional random variables), where |∂(x, y)/∂(u, v)| is the Jacobian determinant

        | ∂x/∂u  ∂x/∂v |
        | ∂y/∂u  ∂y/∂v |

15) Central limit theorem (Liapounoff's form):
    If X1, X2, ..., Xn is a sequence of independent R.V.s with E[Xi] = µi and Var(Xi) = σi², i = 1, 2, ..., n, and if Sn = X1 + X2 + ... + Xn, then under certain general conditions Sn follows a normal distribution with mean µ = Σ_{i=1}^{n} µi and variance σ² = Σ_{i=1}^{n} σi² as n → ∞.

16) Central limit theorem (Lindeberg-Levy's form):
    If X1, X2, ..., Xn is a sequence of independent, identically distributed R.V.s with E[Xi] = µ and Var(Xi) = σ², i = 1, 2, ..., n, and if Sn = X1 + X2 + ... + Xn, then under certain general conditions Sn follows a normal distribution with mean nµ and variance nσ² as n → ∞.

    Note: z = (Sn - nµ) / (σ√n) (for the sum of n variables),
          z = (X̄ - µ) / (σ/√n) (for the sample mean)
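A quick simulation makes the Lindeberg-Levy form concrete. The Python sketch below (illustrative; uniform summands with µ = 1/2, σ² = 1/12 and n = 300 are arbitrary choices) standardises Sn and checks that it behaves like a standard normal:

import numpy as np

rng = np.random.default_rng(6)
n, trials = 300, 10_000
u = rng.uniform(0.0, 1.0, size=(trials, n))    # iid, mu = 1/2, sigma^2 = 1/12

z = (u.sum(axis=1) - n * 0.5) / np.sqrt(n / 12)
print("mean ~ 0            :", z.mean())
print("variance ~ 1        :", z.var())
print("P(|Z| < 1.96) ~ 0.95:", np.mean(np.abs(z) < 1.96))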

UNIT III (MARKOV PROCESSES AND MARKOV CHAINS)

1) Random process: A random process is a collection of random variables {X(s, t)} that are functions of a real variable, namely time t, where s ∈ S and t ∈ T.

2) Classification of random processes: We can classify a random process according to the characteristics of the time t and the random variable X. We shall consider only four cases, based on t and X having values in the ranges -∞ < t < ∞ and -∞ < x < ∞:
   - Continuous random process
   - Continuous random sequence
   - Discrete random process
   - Discrete random sequence

Continuous random process: If X and t are continuous, then we call X(t) a continuous random process.
Example: If X(t) represents the maximum temperature at a place in the interval (0, t), {X(t)} is a continuous random process.

Continuous random sequence: A random process for which X is continuous but time takes only discrete values is called a continuous random sequence.
Example: If X_n represents the temperature at the end of the nth hour of a day, then {X_n, 1 <= n <= 24} is a continuous random sequence.

Discrete random process: If X assumes only discrete values and t is continuous, then we call such a random process {X(t)} a discrete random process.
Example: If X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, ...}.

Discrete random sequence: A random process in which both the random variable and time are discrete is called a discrete random sequence.
Example: If X_n represents the outcome of the nth toss of a fair die, then {X_n : n >= 1} is a discrete random sequence, since T = {1, 2, 3, ...} and S = {1, 2, 3, 4, 5, 6}.

3) Condition for a stationary process: E[X(t)] = constant, Var[X(t)] = constant.
   If the process is not stationary, then it is called evolutionary.

4) Wide sense stationary (or weak sense stationary, or covariance stationary) process: A random process is said to be WSS or covariance stationary if it satisfies the following conditions:
   i) The mean of the process is constant, i.e. E[X(t)] = constant.
   ii) The auto correlation function depends only on τ, i.e. R_XX(τ) = E[X(t) . X(t + τ)].

5) Time average: The time average of a random process {X(t)} is defined as

    X̄_T = (1/2T) ∫_{-T}^{T} X(t) dt.

   If the interval is (0, T), then the time average is X̄_T = (1/T) ∫_0^T X(t) dt.

6) Ergodic process: A random process {X(t)} is called ergodic if all its ensemble averages are interchangeable with the corresponding time averages.

7) Mean ergodic: Let {X(t)} be a random process with mean E[X(t)] = µ and time average X̄_T. Then {X(t)} is said to be mean ergodic if X̄_T → µ as T → ∞, i.e.

    E[X(t)] = lim_{T→∞} X̄_T.

   Note: lim_{T→∞} Var(X̄_T) = 0 (by the mean ergodic theorem).

8) Correlation ergodic process: The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is mean ergodic, where Y(t) = X(t) X(t + τ), i.e.

    E[Y(t)] = lim_{T→∞} Ȳ_T,

   where Ȳ_T is the time average of Y(t).

9) Auto covariance function:

    C_XX(τ) = R_XX(τ) - E[X(t)] E[X(t + τ)]

10) Mean and variance of the time average:
    Mean: E[X̄_T] = (1/2T) ∫_{-T}^{T} E[X(t)] dt
    Variance: Var(X̄_T) = (1/2T) ∫_{-2T}^{2T} (1 - |τ|/2T) C_XX(τ) dτ, where C_XX(τ) = R_XX(τ) - {E[X(t)]}².
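The mean-ergodic idea in item 7 can be watched directly: a single sample path's time average approaches the ensemble mean. A minimal Python sketch (illustrative; the cosine process with a random phase is a hypothetical test case whose ensemble mean is 0):

import numpy as np

rng = np.random.default_rng(7)
A, w = 1.0, 2.0
theta = rng.uniform(0.0, 2 * np.pi)        # fixed for the whole sample path

T = 500.0
t = np.linspace(0.0, T, 1_000_000)
x = A * np.cos(w * t + theta)              # a single realisation

print("time average over (0, T):", x.mean())   # -> ensemble mean 0 as T grows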

11) Markov process: A random process in which the future value depends only on the present value and not on the past values is called a Markov process. It is symbolically represented by

    P[X(t_{n+1}) <= x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
        = P[X(t_{n+1}) <= x_{n+1} / X(t_n) = x_n],

    where t_0 <= t_1 <= t_2 <= ... <= t_n <= t_{n+1}.

12) Markov chain: If, for all n,

    P[X_n = a_n / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0] = P[X_n = a_n / X_{n-1} = a_{n-1}],

    then the process {X_n}, n = 0, 1, 2, ..., is called a Markov chain, where a_0, a_1, a_2, ..., a_n, ... are called the states of the Markov chain.

13) Transition probability matrix (tpm): When the Markov chain is homogeneous, the one-step transition probability is denoted by P_ij. The matrix P = {P_ij} is called the transition probability matrix.

14) Chapman-Kolmogorov theorem: If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^(n) is equal to P^n, i.e.

    [P_ij^(n)] = [P_ij]^n.

15) Markov chain property: If Π = (Π1, Π2, Π3), then Π P = Π and Π1 + Π2 + Π3 = 1.

16) Poisson process: If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called the Poisson process, provided the following postulates are satisfied:
    (i) P[1 occurrence in (t, t + Δt)] = λ Δt + O(Δt)
    (ii) P[0 occurrences in (t, t + Δt)] = 1 - λ Δt + O(Δt)
    (iii) P[2 or more occurrences in (t, t + Δt)] = O(Δt)
    (iv) X(t) is independent of the number of occurrences of the event in any other interval.

17) Probability law of the Poisson process:

    P{X(t) = x} = e^{-λt} (λt)^x / x!,  x = 0, 1, 2, ...

    Mean E[X(t)] = λt,  E[X²(t)] = λ²t² + λt,  Var[X(t)] = λt.
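Items 14 and 15 translate directly into matrix computations. A minimal Python sketch (the 3-state tpm below is a hypothetical example, not one from the notes):

import numpy as np

# Hypothetical 3-state tpm; the entries are illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

# Chapman-Kolmogorov: the n-step tpm is the matrix power P^n.
print(np.linalg.matrix_power(P, 3))

# Markov chain property: the fixed point of  pi P = pi  with sum(pi) = 1,
# found here by repeated multiplication (power iteration).
pi = np.full(3, 1 / 3)
for _ in range(2000):
    pi = pi @ P
print(pi, pi @ P)      # pi and pi P coincide at the fixed point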

UNIT IV (CORRELATION AND SPECTRAL DENSITY)

Notation:
R_XX(τ) - auto correlation function
S_XX(ω) - power spectral density (or spectral density)
R_XY(τ) - cross correlation function
S_XY(ω) - cross power spectral density

1) Auto correlation to power spectral density (spectral density):

    S_XX(ω) = ∫_{-∞}^{∞} R_XX(τ) e^{-iωτ} dτ

2) Power spectral density to auto correlation:

    R_XX(τ) = (1/2π) ∫_{-∞}^{∞} S_XX(ω) e^{iωτ} dω

3) Condition for X(t) and X(t + τ) to be uncorrelated random processes:

    C_XX(τ) = R_XX(τ) - E[X(t)] E[X(t + τ)] = 0

4) Cross power spectrum to cross correlation:

    R_XY(τ) = (1/2π) ∫_{-∞}^{∞} S_XY(ω) e^{iωτ} dω

5) General formulas:
    i) ∫ e^{ax} cos bx dx = [e^{ax} / (a² + b²)] (a cos bx + b sin bx)
    ii) ∫ e^{ax} sin bx dx = [e^{ax} / (a² + b²)] (a sin bx - b cos bx)
    iii) x² - 2ax = (x - a)² - a²
    iv) sin θ = (e^{iθ} - e^{-iθ}) / (2i)
    v) cos θ = (e^{iθ} + e^{-iθ}) / 2

UNIT V (LINEAR SYSTEMS WITH RANDOM INPUTS)

1) Linear system: f is called a linear system if it satisfies

    f[a1 X1(t) ± a2 X2(t)] = a1 f[X1(t)] ± a2 f[X2(t)]

2) Time invariant system: Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time invariant system.

3) Relation between input X(t) and output Y(t):

    Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du,

   where h(u) is the system weighting function.

4) Relation between the power spectra of the input X(t) and the output Y(t):

    S_YY(ω) = S_XX(ω) |H(ω)|²

   If H(ω) is not given, use H(ω) = ∫_{-∞}^{∞} e^{-jωt} h(t) dt.

5) Contour integral (one useful result):

    ∫_{-∞}^{∞} e^{imx} / (a² + x²) dx = (π/a) e^{-|m|a}

6) From the Fourier transform:

    F^{-1}[ 2a / (a² + ω²) ] = e^{-a|τ|}
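The transform pair in item 6 can be verified numerically by computing the forward transform of e^{-a|τ|} as a Riemann sum. A minimal Python sketch (illustrative; a = 1.5 and ω = 2 are arbitrary):

import numpy as np

a, w = 1.5, 2.0
dtau = 1e-3
tau = np.arange(-40.0, 40.0, dtau)

# Forward transform of e^{-a|tau|}, evaluated as a Riemann sum.
S = (np.exp(-a * np.abs(tau)) * np.exp(-1j * w * tau)).sum().real * dtau
print(S, "vs", 2 * a / (a**2 + w**2))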

UNIT I: PROBABILITY AND RANDOM VARIABLES - PART B QUESTIONS

1. A random variable X has the following probability distribution:
       X:    0   1   2    3    4    5    6     7
       P(X): 0   k   2k   2k   3k   k²   2k²   7k² + k
   Find (i) the value of k, (ii) P[1.5 < X < 4.5 / X > 2] and (iii) the smallest value of λ for which P(X <= λ) > 1/2.
2. A bag contains 5 balls and it is not known how many of them are white. Two balls are drawn at random from the bag and they are noted to be white. What is the chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = (1/2) e^{-x/2}, x > 0. Find the moment generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He gets Rs. 10 for each white ball and Rs. 5 for each black ball. Find his expectation.
6. In a certain binary communication channel, owing to noise, the probability that a transmitted zero is received as zero is 0.95 and the probability that a transmitted one is received as one is 0.9. If the probability that a zero is transmitted is 0.4, find the probability that (i) a one is received, (ii) a one was transmitted given that a one was received.
7. Find the MGF and the rth moment for the distribution whose PDF is f(x) = k e^{-x}, x > 0. Find also the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. The second bag contains 2 white, 3 red and 5 black balls, and the third bag contains 3 white, 4 red and 2 black balls. One bag is chosen at random and from it 3 balls are drawn. Out of the 3 balls, 2 balls are white and 1 is red. What are the probabilities that they were taken from the first bag, the second bag and the third bag?
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1. Find (i) P(X < 1/2), (ii) P(1/4 < X < 1/2), (iii) P(X > 3/4 / X > 1/2).
10. If the density function of a continuous random variable X is given by
        f(x) = ax,        0 <= x <= 1
             = a,         1 <= x <= 2
             = 3a - ax,   2 <= x <= 3
             = 0,         otherwise
    (1) find a, (2) find the cdf of X.
11. If the moments of a random variable X are defined by E(X^r) = 0.6, r = 1, 2, ..., show that P(X = 0) = 0.4, P(X = 1) = 0.6 and P(X >= 2) = 0.

12. In a continuous distribution, the probability density is given by f(x) = kx(2 - x), 0 < x < 2. Find k, the mean, the variance and the distribution function.
13. The cumulative distribution function of a random variable X is given by
        F(x) = 0,                    x < 0
             = x²,                   0 <= x < 1/2
             = 1 - (3/25)(3 - x)²,   1/2 <= x < 3
             = 1,                    x >= 3
    Find the pdf of X and evaluate P(|X| <= 1) using both the pdf and the cdf.
14. Find the moment generating function of the geometric random variable with pdf f(x) = p q^{x-1}, x = 1, 2, 3, ..., and hence find its mean and variance.
15. A box contains 5 red and 4 white balls. A ball from the box is taken out at random and kept outside. If once again a ball is drawn from the box, what is the probability that the drawn ball is red?
16. A discrete random variable X has the moment generating function M_X(t) = ((1/4) + (3/4) e^t)^5. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to decay exponentially at rate α, so the following pdf is proposed: f(x) = C e^{-α|x|}, -∞ < x < ∞. Find C and E(X).
18. Find the MGF of the binomial distribution and hence find its mean and variance. Also find the recurrence relation of the central moments for a binomial distribution.
19. The number of monthly breakdowns of a computer is a RV having a Poisson distribution with mean equal to 1.8. Find the probability that this computer will function for a month (a) without a breakdown, (b) with only one breakdown, (c) with at least one breakdown.
20. Find the MGF and hence find the mean and variance of the binomial distribution.
21. State and prove the additive property of Poisson random variables.
22. If X and Y are two independent Poisson random variables, then show that the probability distribution of X given X + Y follows a binomial distribution.
23. Find the MGF and hence find the mean and variance of a geometric distribution.
24. State and prove the memoryless property of a geometric distribution.
25. Find the mean and variance of a uniform distribution.

UNIT II: TWO DIMENSIONAL RANDOM VARIABLES - PART B QUESTIONS

1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1, and 0 otherwise, compute the correlation coefficient between X and Y.
2. The joint p.d.f of a two dimensional random variable (X, Y) is given by f(x, y) = (8/9) xy, 1 <= x <= y <= 2. Find the marginal density functions of X and Y. Find also the conditional density functions of Y / X = x and X / Y = y.
3. The joint probability density function of X and Y is given by f(x, y) = (x + y)/3, 0 <= x <= 1, 0 < y < 2. Obtain the regression of Y on X and of X on Y.
4. If the joint p.d.f of two random variables is given by f(x1, x2) = 6 e^{-2x1 - 3x2}, x1 > 0, x2 > 0, find the probability that the first random variable will take on a value between 1 and 2 and the second random variable will take on a value between 2 and 3. Also find the probability that the first random variable will take on a value less than 1 and the second random variable will take on a value greater than 2.
5. If two random variables have the joint p.d.f f(x1, x2) = (2/3)(x1 + 2x2), 0 < x1 < 1, 0 < x2 < 1, find the marginal distributions.
6. Find the value of k if f(x, y) = k xy for 0 < x, y < 1 is to be a joint density function. Find P(X + Y < 1). Are X and Y independent?
7. If a two dimensional random variable has the joint p.d.f f(x, y) = (6/5)(x + y²), 0 < x < 1, 0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
8. Two random variables X and Y have the p.d.f f(x, y) = x² + (xy/3), 0 <= x <= 1, 0 <= y <= 2. Prove that X and Y are not independent. Find the conditional density function.
9. X and Y are random variables with joint p.d.f f(x, y) = 4xy e^{-(x² + y²)}, x >= 0, y >= 0. Find the p.d.f of sqrt(x² + y²).
10. Two random variables X and Y have the joint p.d.f f(x, y) = 2 - x - y, 0 < x < 1, 0 < y < 1. Find the marginal probability density functions of X and Y. Also find the conditional density functions and the covariance between X and Y.
11. Let X and Y be two random variables each taking the three values -1, 0 and 1 and having a given joint p.d.f. Prove that X and Y have different expectations. Also prove that X and Y are uncorrelated, and find Var X and Var Y.

12. 20 dice are thrown. Find the approximate probability that the sum obtained is between 65 and 75, using the central limit theorem.
13. Examine whether the variables X and Y are independent, whose joint density is f(x, y) = x e^{-x(1 + y)}, 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf of Z = X / Y.
15. Let X and Y be independent uniform random variables over (0, 1). Find the PDF of Z = X + Y.

UNIT III: CLASSIFICATION OF RANDOM PROCESSES - PART B QUESTIONS

1. The process {X(t)} has the probability distribution

    P[X(t) = n] = (at)^{n-1} / (1 + at)^{n+1},  n = 1, 2, ...
                = at / (1 + at),                n = 0

   Show that it is not stationary.
2. A raining process is considered as a two state Markov chain. If it rains, it is considered to be in state 0, and if it does not rain, the chain is in state 1. The transition probability matrix of the Markov chain is

    P = [0.6  0.4]
        [0.2  0.8]

   Find the probability that it will rain for three days from today, assuming that it is raining today. Find also the unconditional probability that it will rain after three days, with the initial probabilities of state 0 and state 1 being 0.4 and 0.6 respectively.
3. Let X(t) be a Poisson process with arrival rate λ. Find E{(X(t) - X(s))²} for t > s.
4. Let {X_n : n = 1, 2, ...} be a Markov chain on the state space S = {1, 2, 3} with the given one-step transition probability matrix P.
   (1) Sketch the transition diagram. (2) Is the chain irreducible? Explain. (3) Is the chain ergodic? Explain.
5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where U and V are independent random variables for which E(U) = E(V) = 0 and E(U²) = E(V²) = 1.

   (1) Find the auto covariance function of X(t). (2) Is X(t) wide sense stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency ω0 with a random phase θ that is uniformly distributed over (0, 2π). The received carrier signal is X(t) = A cos(ω0 t + θ). Show that the process is second order stationary.
8. Assume that a computer system is in any one of the three states: busy, idle and under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day, we get the transition probability matrix

    P = [0.6  0.2  0.2]
        [0.1  0.8  0.1]
        [0.6  0    0.4]

   Find out the 3rd step transition probability matrix. Determine the limiting probabilities.
9. Given a random variable Ω with density f(ω), and another random variable θ uniformly distributed in (-π, π) and independent of Ω, and X(t) = a cos(Ωt + θ), prove that {X(t)} is a WSS process.
10. A man either drives a car or catches a train to go to office each day. He never goes 2 days in a row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work iff a 6 appeared. Find (1) the probability that he takes a train on the 3rd day and (2) the probability that he drives to work in the long run.
11. Show that the process X(t) = A cos ωt + B sin ωt (where A and B are random variables) is wide sense stationary if (1) E(A) = E(B) = 0, (2) E(A²) = E(B²) and (3) E(AB) = 0.
12. Find the probability distribution of the Poisson process.
13. Prove that the sum of two Poisson processes is again a Poisson process.
14. Write the classifications of random processes, with an example for each.

UNIT 4: CORRELATION AND SPECTRUM DENSITIES
(The question pages for this unit appeared as scanned images and are not available in text form.)

B.E./B.Tech. DEGREE EXAMINATION, APRIL/MAY
Fourth Semester
Electronics and Communication Engineering
MA2261 - PROBABILITY AND RANDOM PROCESS
(Common to Bio Medical Engineering)
(Regulation 2008)
Time: Three hours                              Maximum: 100 marks
Use of Statistical Tables is permitted.
Answer ALL Questions.

PART A (10 x 2 = 20 Marks)

1. If the p.d.f of a random variable X is f(x) = x/2 in 0 <= x <= 2, find P(X > 1.5 / X > 1).
2. If the MGF of a uniform distribution for a random variable X is (1/t)(e^{5t} - e^{4t}), find E(X).
3. Find the value of k if f(x, y) = k(1 - x)(1 - y) in 0 < x, y < 1, and f(x, y) = 0 otherwise, is to be the joint density function.
4. A random variable X has mean 10 and variance 16. Find the lower bound for P(5 < X < 15).
5. Define a wide sense stationary process.
6. Define a Markov chain and give an example.
7. Find the mean of the stationary process {X(t)} whose autocorrelation function is given by R(τ) = 16 + 9/(1 + 6τ²).
8. Find the power spectral density function of the stationary process whose autocorrelation function is given by e^{-|τ|}.
9. Define a time invariant system.
10. State the autocorrelation function of white noise.

PART B (5 x 16 = 80 marks)

11. (a) (i) The probability mass function of a random variable X is defined as P(X = 0) = 3C³, P(X = 1) = 4C - 10C², P(X = 2) = 5C - 1, where C > 0, and P(X = r) = 0 if r ≠ 0, 1, 2. Find
    (1) the value of C,
    (2) P(0 < X < 2 / X > 0),
    (3) the distribution function of X,
    (4) the largest value of X for which F(x) < 1/2.
(ii) If the probability that an applicant for a driver's license will pass the road test on any given trial is 0.8, what is the probability that he will finally pass the test (1) on the fourth trial and (2) in fewer than 4 trials?

Or

(b) (i) Find the MGF of the two parameter exponential distribution whose density function is given by f(x) = λ e^{-λ(x - a)}, x >= a, and hence find the mean and variance.
(ii) The marks obtained by a number of students in a certain subject are assumed to be normally distributed with mean 65 and standard deviation 5. If 3 students are selected at random from this group, what is the probability that two of them will have marks over 70?

12. (a) (i) The bivariate probability distribution of (X, Y) is given below:

        Y:      1      2      3      4      5      6
    X = 0:      0      0      1/32   2/32   2/32   3/32
    X = 1:      1/16   1/16   1/8    1/8    1/8    1/8
    X = 2:      1/32   1/32   1/64   1/64   0      2/64

Find the marginal distributions, the conditional distribution of X given Y and the conditional distribution of Y given X.
(ii) Find the covariance of X and Y if the random variable (X, Y) has the joint p.d.f f(x, y) = x + y, 0 <= x <= 1, 0 <= y <= 1, and f(x, y) = 0 otherwise.

Or

(b) (i) The joint p.d.f of the two dimensional random variable (X, Y) is given by f(x, y) = (8/9) xy, 1 <= x <= y <= 2, and f(x, y) = 0 otherwise. Find the densities of X and Y, and the conditional densities f(x / y) and f(y / x).
(ii) A sample of size 100 is taken from a population whose mean is 60 and variance is 400. Using the Central Limit Theorem, find the probability with which the mean of the sample will not differ from 60 by more than 4.

13. (a) (i) Examine whether the random process {X(t)} = A cos(ωt + θ) is wide sense stationary if A and ω are constants and θ is a uniformly distributed random variable in (0, 2π).
(ii) Assume that the number of messages input to a communication channel in an interval of duration t seconds is a Poisson process with mean λ = 0.3. Compute
    (1) the probability that exactly 3 messages will arrive during a 10 second interval,
    (2) the probability that the number of message arrivals in an interval of duration 5 seconds is between 3 and 7.

Or

(b) (i) The random binary transmission process {X(t)} is a wide sense stationary process with zero mean and autocorrelation function R(τ) = 1 - |τ|/T, where T is a constant. Find the mean and variance of the time average of {X(t)} over (0, T). Is {X(t)} mean ergodic?
(ii) The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, ..., having three states 1, 2 and 3 is

    P = [0.1  0.5  0.4]
        [0.6  0.2  0.2]
        [0.3  0.4  0.3]

and the initial distribution is P(0) = [0.7, 0.2, 0.1]. Find (1) P(X2 = 3) and (2) P(X3 = 2, X2 = 3, X1 = 3, X0 = 2).

14. (a) (i) Find the autocorrelation function of the periodic time function {X(t)} = A sin ωt.
(ii) The autocorrelation function of the random binary transmission {X(t)} is given by R(τ) = 1 - |τ|/T for |τ| < T and R(τ) = 0 for |τ| > T. Find the power spectrum of the process {X(t)}.

Or

(b) (i) {X(t)} and {Y(t)} are zero mean and stochastically independent random processes having autocorrelation functions R_XX(τ) = e^{-2|τ|} and R_YY(τ) = cos 2πτ respectively. Find
    (1) the autocorrelation functions of W(t) = X(t) + Y(t) and Z(t) = X(t) - Y(t),
    (2) the cross correlation function of W(t) and Z(t).
(ii) Find the autocorrelation function of the process {X(t)} for which the power spectral density is given by S_XX(ω) = 1 + ω² for |ω| < 1 and S_XX(ω) = 0 for |ω| > 1.

15. (a) (i) A wide sense stationary random process {X(t)} with autocorrelation R_XX(τ) = A e^{-a|τ|}, where A and a are real positive constants, is applied to the input of a linear time invariant system with impulse response h(t) = e^{-bt} u(t), where b is a real positive constant. Find the autocorrelation of the output Y(t) of the system.

(ii) If {X(t)} is a band limited process such that S_XX(ω) = 0 when |ω| > σ, prove that

    2 [R_XX(0) - R_XX(τ)] <= σ² τ² R_XX(0).

Or

(b) (i) Assume a random process X(t) is given as input to a system with transfer function H(ω) = 1 for -ω0 < ω < ω0. If the autocorrelation function of the input process is (N0/2) δ(τ), find the autocorrelation function of the output process.
(ii) If Y(t) = A cos(ω0 t + θ) + N(t), where A is a constant, θ is a random variable with a uniform distribution in (-π, π) and {N(t)} is a band limited Gaussian white noise with power spectral density S_NN(ω) = N0/2 for |ω - ω0| < ωB and S_NN(ω) = 0 elsewhere, find the power spectral density of Y(t), assuming that N(t) and θ are independent.

B.E./B.Tech. DEGREE EXAMINATIONS, APRIL/MAY
Fourth Semester
Electronics and Communication Engineering
MA2261 - PROBABILITY AND RANDOM PROCESS
(Common to Bio Medical Engineering)
(Regulation 2008)
Time: Three hours                              Maximum: 100 marks
Answer ALL Questions.

PART A (10 x 2 = 20 Marks)

1. The CDF of a continuous random variable is given by F(x) = 0 for x < 0 and F(x) = 1 - e^{-x/5} for 0 <= x < ∞. Find the PDF and the mean of X.
2. Establish the memoryless property of the exponential distribution.
3. Let X and Y be continuous random variables with joint probability density function f_XY(x, y) = x(x - y)/8, 0 < x < 2, -x < y < x, and f_XY(x, y) = 0 elsewhere. Find f_{Y/X}(y / x).
4. Find the acute angle between the two lines of regression.
5. Prove that a first order stationary process has a constant mean.
6. State the postulates of a Poisson process.
7. The autocorrelation function of a stationary random process is R(τ) = 16 + 9/(1 + 6τ²). Find the mean and variance of the process.
8. Prove that for a WSS process {X(t)}, R_XX(t, t + τ) is an even function of τ.
9. Find the system transfer function if a linear time invariant system has an impulse response H(t) = 1/(2c) for |t| <= c and H(t) = 0 for |t| > c.

10. Define white noise.

PART B (5 x 16 = 80 marks)

11. (a) The probability density function of a random variable X is given by

    f_X(x) = x,        0 < x < 1
           = k(2 - x), 1 <= x <= 2
           = 0,        otherwise

    (1) Find the value of k. (4)
    (2) Find P(0.2 < X < 1.2). (3)
    (3) What is P(0.5 < X < 1.5 / X >= 1)? (4)
    (4) Find the distribution function of X. (5)

Or

(b) (i) Derive the m.g.f of the Poisson distribution and hence (or otherwise) deduce its mean and variance.
(ii) The marks obtained by a number of students in a certain subject are assumed to be normally distributed with mean 65 and standard deviation 5. If 3 students are selected at random from this set, what is the probability that exactly 2 of them will have marks over 70?

12. (a) (i) If X and Y are independent Poisson random variables with respective parameters λ1 and λ2, calculate the conditional distribution of X given that X + Y = n.
(ii) The regression equation of X on Y is 3Y - 5X + 108 = 0. If the mean value of Y is 44 and the variance of X is 9/16 of the variance of Y, find the mean value of X and the correlation coefficient.

Or

(b) (i) If X and Y are independent random variables with density functions

    f_X(x) = 1,   1 <= x <= 2, and 0 otherwise;
    f_Y(y) = y/6, 2 <= y <= 4, and 0 otherwise,

find the density function of Z = XY.
(ii) The life time of a particular variety of electric bulb may be considered as a random variable with mean 1200 hours and standard deviation 250 hours. Using the central limit theorem, find the probability that the average life time of 60 bulbs exceeds 1250 hours.

13. (a) (i) A random process X(t) is defined by X(t) = A cos t + B sin t, -∞ < t < ∞, where A and B are independent random variables, each of which takes the value -2 with probability 1/3 and the value 1 with probability 2/3. Show that X(t) is wide sense stationary.
(ii) A random process has sample functions of the form X(t) = A cos(ωt + θ), where ω is constant, A is a random variable with mean zero and variance one, and θ is a random variable that is uniformly distributed between 0 and 2π. Assume that the random variables A and θ are independent. Is X(t) a mean ergodic process?

Or

113 4. (a) (i) The power spectral density function of a zero mean WSS process X ( t) is given by, ω < ω π S( ω). Find R( τ ) and show that X ( t) and X t + are uncorrelated., otherwise ω (ii) The Auto correlation function of a WSS process is given by R( τ ) α e λ τ determine the power spectral density of the process. (b) (i) State and prove Weiner Khintchine Theorem. Or (ii) The cross power spectrum of real random processes { X ( t) } and { Y ( t) } S xy a + bj ω, for ω < ( ω). Find the cross correlation function., elsewhere is given by 5. (a) (i) Consider a system with transfer function. An input signal with autocorrelation + jω function m δ ( τ ) + m is fed as input to the system. Find the mean and mean square value of the output. (ii) A stationary random process X ( t) having the autocorrelation function R ( ) ( ) XX τ A δ τ is applied to a linear system at time t where f ( τ ) represent the bt impulse function. The linear system has the impulse response of h ( t) e u ( t) where u ( t) represents the unit step function. Find R ( τ ). Also find the mean and variance of Y ( t ). Or YY (b) (i) A linear system is described by the impulse response t RC h ( t) e u ( t). Assume an RC input process whose Auto correlation function is Bδ δ ( τ ). Find the mean and Auto correlation function of the output process.

(ii) If {N(t)} is a band limited white noise centered at a carrier frequency ω0 such that S_NN(ω) = N0/2 for |ω - ω0| < ωB and S_NN(ω) = 0 elsewhere, find the autocorrelation of {N(t)}.


Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

More than one variable

More than one variable Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Probability Theory and Statistics. Peter Jochumzen

Probability Theory and Statistics. Peter Jochumzen Probability Theory and Statistics Peter Jochumzen April 18, 2016 Contents 1 Probability Theory And Statistics 3 1.1 Experiment, Outcome and Event................................ 3 1.2 Probability............................................

More information

Final Exam # 3. Sta 230: Probability. December 16, 2012

Final Exam # 3. Sta 230: Probability. December 16, 2012 Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Review of Probability. CS1538: Introduction to Simulations

Review of Probability. CS1538: Introduction to Simulations Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed

More information

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

Probability- the good parts version. I. Random variables and their distributions; continuous random variables.

Probability- the good parts version. I. Random variables and their distributions; continuous random variables. Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

Random Variables and Expectations

Random Variables and Expectations Inside ECOOMICS Random Variables Introduction to Econometrics Random Variables and Expectations A random variable has an outcome that is determined by an experiment and takes on a numerical value. A procedure

More information

Let X and Y denote two random variables. The joint distribution of these random

Let X and Y denote two random variables. The joint distribution of these random EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.

More information

1 Probability theory. 2 Random variables and probability theory.

1 Probability theory. 2 Random variables and probability theory. Probability theory Here we summarize some of the probability theory we need. If this is totally unfamiliar to you, you should look at one of the sources given in the readings. In essence, for the major

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

Chapter 2: Random Variables

Chapter 2: Random Variables ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

FINAL EXAM: 3:30-5:30pm

FINAL EXAM: 3:30-5:30pm ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.

More information

FINAL EXAM: Monday 8-10am

FINAL EXAM: Monday 8-10am ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.

More information

Review: mostly probability and some statistics

Review: mostly probability and some statistics Review: mostly probability and some statistics C2 1 Content robability (should know already) Axioms and properties Conditional probability and independence Law of Total probability and Bayes theorem Random

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

Joint Probability Distributions, Correlations

Joint Probability Distributions, Correlations Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

Continuous Random Variables

Continuous Random Variables 1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

King Saud University College of Since Statistics and Operations Research Department PROBABILITY (I) 215 STAT. By Weaam Alhadlaq

King Saud University College of Since Statistics and Operations Research Department PROBABILITY (I) 215 STAT. By Weaam Alhadlaq King Saud University College of Since Statistics and Operations Research Department PROBABILITY (I) 25 STAT By Weaam Alhadlaq I Topics to be covered. General Review: Basic concepts of Probability and Random

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

Chp 4. Expectation and Variance

Chp 4. Expectation and Variance Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.

More information

Week 12-13: Discrete Probability

Week 12-13: Discrete Probability Week 12-13: Discrete Probability November 21, 2018 1 Probability Space There are many problems about chances or possibilities, called probability in mathematics. When we roll two dice there are possible

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

Discrete Probability Refresher

Discrete Probability Refresher ECE 1502 Information Theory Discrete Probability Refresher F. R. Kschischang Dept. of Electrical and Computer Engineering University of Toronto January 13, 1999 revised January 11, 2006 Probability theory

More information

ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations

ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations ORF 245 Fundamentals of Statistics Chapter 4 Great Expectations Robert Vanderbei Fall 2014 Slides last edited on October 20, 2014 http://www.princeton.edu/ rvdb Definition The expectation of a random variable

More information

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

Review of Statistics I

Review of Statistics I Review of Statistics I Hüseyin Taştan 1 1 Department of Economics Yildiz Technical University April 17, 2010 1 Review of Distribution Theory Random variables, discrete vs continuous Probability distribution

More information

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. A Probability Primer A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. Are you holding all the cards?? Random Events A random event, E,

More information

Practice Examination # 3

Practice Examination # 3 Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single

More information

There are two basic kinds of random variables continuous and discrete.

There are two basic kinds of random variables continuous and discrete. Summary of Lectures 5 and 6 Random Variables The random variable is usually represented by an upper case letter, say X. A measured value of the random variable is denoted by the corresponding lower case

More information

Expectation of Random Variables

Expectation of Random Variables 1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete

More information

ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process

ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random

More information

Random Variables. P(x) = P[X(e)] = P(e). (1)

Random Variables. P(x) = P[X(e)] = P(e). (1) Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment

More information

1: PROBABILITY REVIEW

1: PROBABILITY REVIEW 1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following

More information

STAT 516 Midterm Exam 3 Friday, April 18, 2008

STAT 516 Midterm Exam 3 Friday, April 18, 2008 STAT 56 Midterm Exam 3 Friday, April 8, 2008 Name Purdue student ID (0 digits). The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

We introduce methods that are useful in:

We introduce methods that are useful in: Instructor: Shengyu Zhang Content Derived Distributions Covariance and Correlation Conditional Expectation and Variance Revisited Transforms Sum of a Random Number of Independent Random Variables more

More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Preliminary Statistics. Lecture 3: Probability Models and Distributions

Preliminary Statistics. Lecture 3: Probability Models and Distributions Preliminary Statistics Lecture 3: Probability Models and Distributions Rory Macqueen (rm43@soas.ac.uk), September 2015 Outline Revision of Lecture 2 Probability Density Functions Cumulative Distribution

More information

Probability and Statistics

Probability and Statistics Probability and Statistics 1 Contents some stochastic processes Stationary Stochastic Processes 2 4. Some Stochastic Processes 4.1 Bernoulli process 4.2 Binomial process 4.3 Sine wave process 4.4 Random-telegraph

More information

Chapter 3 Discrete Random Variables

Chapter 3 Discrete Random Variables MICHIGAN STATE UNIVERSITY STT 351 SECTION 2 FALL 2008 LECTURE NOTES Chapter 3 Discrete Random Variables Nao Mimoto Contents 1 Random Variables 2 2 Probability Distributions for Discrete Variables 3 3 Expected

More information

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx INDEPENDENCE, COVARIANCE AND CORRELATION Independence: Intuitive idea of "Y is independent of X": The distribution of Y doesn't depend on the value of X. In terms of the conditional pdf's: "f(y x doesn't

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Deccan Education Society s FERGUSSON COLLEGE, PUNE (AUTONOMOUS) SYLLABUS UNDER AUTOMONY. SECOND YEAR B.Sc. SEMESTER - III

Deccan Education Society s FERGUSSON COLLEGE, PUNE (AUTONOMOUS) SYLLABUS UNDER AUTOMONY. SECOND YEAR B.Sc. SEMESTER - III Deccan Education Society s FERGUSSON COLLEGE, PUNE (AUTONOMOUS) SYLLABUS UNDER AUTOMONY SECOND YEAR B.Sc. SEMESTER - III SYLLABUS FOR S. Y. B. Sc. STATISTICS Academic Year 07-8 S.Y. B.Sc. (Statistics)

More information