5 Operations on Multiple Random Variables
EE360 Random Signal Analysis - Chapter 5: Operations on Multiple Random Variables

Expected Value of a Function of Random Variables
Two r.v.'s: $\bar{g} = E[g(X,Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x,y)\, f_{X,Y}(x,y)\, dx\, dy$
N r.v.'s: $\bar{g} = E[g(X_1, X_2, \ldots, X_N)] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} g(x_1, \ldots, x_N)\, f_{X_1,\ldots,X_N}(x_1, \ldots, x_N)\, dx_1 \cdots dx_N$
Example: weighted sum of r.v.'s
$g(X_1, X_2, \ldots, X_N) = \sum_{i=1}^{N} \alpha_i X_i$ = weighted sum of r.v.'s
$E[g(X_1, X_2, \ldots, X_N)] = E\left[\sum_{i=1}^{N} \alpha_i X_i\right] = \sum_{i=1}^{N} \alpha_i E[X_i]$
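A quick Monte Carlo check of this linearity property can be sketched as follows. This is not part of the notes: the three distributions and the weights are arbitrary assumptions chosen so the exact answer (6.0) is easy to verify.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Assumed example distributions with known means (not from the notes)
X1 = rng.normal(3.0, 1.0, N)        # E[X1] = 3
X2 = rng.exponential(2.0, N)        # E[X2] = 2
X3 = rng.uniform(0.0, 1.0, N)       # E[X3] = 0.5
alpha = np.array([2.0, -1.0, 4.0])  # arbitrary weights

Y = alpha[0] * X1 + alpha[1] * X2 + alpha[2] * X3
print(Y.mean())                            # Monte Carlo estimate of E[Y]
print(alpha @ np.array([3.0, 2.0, 0.5]))   # sum_i alpha_i E[X_i] = 6.0
```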
Joint Moments about the Origin
Joint moment: $m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k\, f_{X,Y}(x,y)\, dx\, dy$
$m_{10} = E[X]$, $\quad m_{01} = E[Y]$
Second-order joint moment = correlation of X and Y:
$m_{11} = E[XY] = R_{XY} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_{X,Y}(x,y)\, dx\, dy$
If $R_{XY} = E[XY] = E[X]E[Y]$, then X and Y are uncorrelated.
If X and Y are independent, then X and Y are uncorrelated, but the converse is not true in general (except for Gaussian r.v.'s).
If $R_{XY} = 0$, then X and Y are orthogonal.
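The distinction between correlated and uncorrelated pairs can be checked numerically. The sketch below uses a made-up pair $Y = 3X + W$ and an independent variable $Z$; none of these variables come from the notes.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500_000

X = rng.normal(1.0, 2.0, N)
W = rng.normal(0.0, 1.0, N)      # noise, independent of X
Y = 3.0 * X + W                  # deliberately correlated with X

R_XY = np.mean(X * Y)            # estimate of the correlation E[XY]
print(R_XY, X.mean() * Y.mean()) # R_XY != E[X]E[Y]  -> X, Y correlated

Z = rng.normal(0.0, 1.0, N)      # independent of X
print(np.mean(X * Z), X.mean() * Z.mean())  # both ~ 0 -> X, Z uncorrelated
```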
Example
$Y = -6X + 22$, $\quad \bar{X} = 3$, $\quad \sigma_X^2 = 2$
$m_{20} = E[X^2] = \sigma_X^2 + \bar{X}^2 = 2 + 9 = 11$
$\bar{Y} = E[-6X + 22] = -6\bar{X} + 22 = -18 + 22 = 4$
$R_{XY} = E[XY] = E[-6X^2 + 22X] = -6(11) + 22(3) = 0 \Rightarrow$ X and Y are orthogonal.
$R_{XY} \neq E[X]E[Y] = 12 \Rightarrow$ X and Y are correlated.
Example
If $Y = aX + b$, then X and Y are always correlated when $a \neq 0$:
$R_{XY} = E[XY] = E[aX^2 + bX] = aE[X^2] + bE[X]$.
If we want X and Y to be orthogonal, i.e. $R_{XY} = 0 = E[aX^2 + bX]$, choose
$b = -aE[X^2]/E[X]$.
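A sketch verifying this orthogonality condition by simulation. It reuses the numbers of the previous example ($a = -6$, $\bar{X} = 3$, $\sigma_X^2 = 2$), so $b$ comes out to 22; the Gaussian choice for X is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(3.0, np.sqrt(2.0), 1_000_000)  # E[X] = 3, E[X^2] = 2 + 3^2 = 11

a = -6.0
b = -a * 11.0 / 3.0          # b = -a E[X^2]/E[X] = 22, matching the earlier example
Y = a * X + b

print(np.mean(X * Y))        # R_XY ~ 0  -> orthogonal
print(X.mean() * Y.mean())   # E[X]E[Y] = 12 != 0 -> still correlated
```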
Joint Moments about the Origin: N-dimensional case
$m_{n_1 n_2 \cdots n_N} = E[X_1^{n_1} X_2^{n_2} \cdots X_N^{n_N}] = \int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} x_1^{n_1} \cdots x_N^{n_N}\, f_{X_1,\ldots,X_N}(x_1, \ldots, x_N)\, dx_1 \cdots dx_N$
with $n_i = 0, 1, \ldots$ for $i = 1, 2, \ldots, N$.
Joint Central Moments
$\mu_{nk} = E[(X - \bar{X})^n (Y - \bar{Y})^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})^n (y - \bar{Y})^k\, f_{X,Y}(x,y)\, dx\, dy$
$\mu_{20} = E[(X - \bar{X})^2] = \sigma_X^2$ and $\mu_{02} = E[(Y - \bar{Y})^2] = \sigma_Y^2$
Covariance of X and Y:
$C_{XY} = \mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = E[XY - X\bar{Y} - \bar{X}Y + \bar{X}\bar{Y}] = R_{XY} - \bar{X}\bar{Y} - \bar{X}\bar{Y} + \bar{X}\bar{Y}$
$C_{XY} = R_{XY} - \bar{X}\bar{Y}$
Comments on the covariance of X and Y: $C_{XY} = R_{XY} - \bar{X}\bar{Y}$
X, Y uncorrelated, i.e. $R_{XY} = \bar{X}\bar{Y}$ $\Rightarrow$ $C_{XY} = 0$
X, Y orthogonal, i.e. $R_{XY} = 0$ $\Rightarrow$ $C_{XY} = -\bar{X}\bar{Y}$
X, Y orthogonal and ($\bar{X} = 0$ or $\bar{Y} = 0$) $\Rightarrow$ $C_{XY} = 0$
Correlation Coefficient $\rho$
$\rho = \dfrac{\mu_{11}}{\sqrt{\mu_{20}\mu_{02}}} = \dfrac{C_{XY}}{\sigma_X \sigma_Y} = E\left[\left(\dfrac{X - \bar{X}}{\sigma_X}\right)\left(\dfrac{Y - \bar{Y}}{\sigma_Y}\right)\right]$
Using the Cauchy-Schwarz inequality $(E[UV])^2 \le E[U^2]E[V^2]$, show that $|\rho| \le 1$.
Let $U = \dfrac{X - \bar{X}}{\sigma_X}$ and $V = \dfrac{Y - \bar{Y}}{\sigma_Y}$; then
$\left(E\left[\dfrac{X - \bar{X}}{\sigma_X}\cdot\dfrac{Y - \bar{Y}}{\sigma_Y}\right]\right)^2 \le E\left[\left(\dfrac{X - \bar{X}}{\sigma_X}\right)^2\right] E\left[\left(\dfrac{Y - \bar{Y}}{\sigma_Y}\right)^2\right]$
$\rho^2 \le \dfrac{\sigma_X^2}{\sigma_X^2}\cdot\dfrac{\sigma_Y^2}{\sigma_Y^2} = 1 \quad\Rightarrow\quad |\rho| \le 1$
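A short sketch estimating $\rho$ from samples and confirming it stays in $[-1, 1]$; the linear-plus-noise model for Y is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
X = rng.normal(0.0, 2.0, N)
Y = -0.7 * X + rng.normal(0.0, 1.0, N)   # assumed linear-plus-noise relation

C_XY = np.mean((X - X.mean()) * (Y - Y.mean()))
rho = C_XY / (X.std() * Y.std())
print(rho)                       # always falls in [-1, 1]
print(np.corrcoef(X, Y)[0, 1])   # NumPy's built-in estimate agrees
```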
N r.v.'s: $\mu_{n_1 n_2 \cdots n_N}$
For N r.v.'s $X_1, X_2, \ldots, X_N$ the $(n_1 + n_2 + \cdots + n_N)$-order joint central moment is defined by
$\mu_{n_1 n_2 \cdots n_N} = E\left[(X_1 - \bar{X}_1)^{n_1} (X_2 - \bar{X}_2)^{n_2} \cdots (X_N - \bar{X}_N)^{n_N}\right]$
Example: variance of a weighted sum
Let $Y = \sum_{i=1}^{N} \alpha_i X_i$, where the $\alpha_i$ are real weights. Find $\sigma_Y^2$.
$\bar{Y} = E[Y] = \sum_{i=1}^{N} \alpha_i E[X_i] = \sum_{i=1}^{N} \alpha_i \bar{X}_i$
$Y - \bar{Y} = \sum_{i=1}^{N} \alpha_i (X_i - \bar{X}_i)$ and
$\sigma_Y^2 = E[(Y - \bar{Y})^2] = E\left[\sum_{i=1}^{N} \alpha_i (X_i - \bar{X}_i) \sum_{j=1}^{N} \alpha_j (X_j - \bar{X}_j)\right] = \sum_{i=1}^{N}\sum_{j=1}^{N} \alpha_i \alpha_j E[(X_i - \bar{X}_i)(X_j - \bar{X}_j)]$
$\sigma_Y^2 = \sum_{i=1}^{N}\sum_{j=1}^{N} \alpha_i \alpha_j C_{X_i X_j}$
If the $X_i$ are uncorrelated, i.e. $C_{X_i X_j} = \sigma_{X_i}^2 \delta_{ij}$, we get $\sigma_Y^2 = \sum_{i=1}^{N} \alpha_i^2 \sigma_{X_i}^2$.
The variance of a weighted sum of uncorrelated random variables (weights $\alpha_i$) equals the weighted sum of the variances of the random variables (weights $\alpha_i^2$).
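A sketch checking the double-sum formula and its uncorrelated special case; the 3x3 covariance matrix and the weights below are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
alpha = np.array([1.0, -2.0, 0.5])          # assumed weights

# Assumed covariance matrix of (X1, X2, X3)
C_X = np.array([[4.0, 1.0, 0.0],
                [1.0, 9.0, 2.0],
                [0.0, 2.0, 1.0]])

var_Y = alpha @ C_X @ alpha                  # sum_i sum_j alpha_i alpha_j C_XiXj
print(var_Y)

# Monte Carlo check with zero-mean jointly Gaussian samples
X = rng.multivariate_normal(np.zeros(3), C_X, size=500_000)
print(np.var(X @ alpha))

# Uncorrelated case: only the diagonal terms survive
print(alpha**2 @ np.diag(C_X))               # sum_i alpha_i^2 sigma_Xi^2
```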
Joint Characteristic Function (2-D Fourier Transform)
$\Phi_{X,Y}(\omega_1, \omega_2) = E[e^{j\omega_1 X + j\omega_2 Y}]$, with $\omega_1, \omega_2$ real numbers.
$\Phi_{X,Y}(\omega_1, \omega_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y)\, e^{j\omega_1 x + j\omega_2 y}\, dx\, dy$
$f_{X,Y}(x,y) = \dfrac{1}{(2\pi)^2}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \Phi_{X,Y}(\omega_1, \omega_2)\, e^{-j\omega_1 x - j\omega_2 y}\, d\omega_1\, d\omega_2$
Marginal characteristic functions: $\Phi_X(\omega_1) = \Phi_{X,Y}(\omega_1, 0)$, $\quad \Phi_Y(\omega_2) = \Phi_{X,Y}(0, \omega_2)$
Joint moment $m_{nk}$: $\quad m_{nk} = (-j)^{n+k} \left.\dfrac{\partial^{n+k}\Phi_{X,Y}(\omega_1, \omega_2)}{\partial\omega_1^n\, \partial\omega_2^k}\right|_{\omega_1 = \omega_2 = 0}$
Example on using $m_{nk} = (-j)^{n+k} \left.\dfrac{\partial^{n+k}\Phi_{X,Y}(\omega_1, \omega_2)}{\partial\omega_1^n\, \partial\omega_2^k}\right|_{\omega_1 = \omega_2 = 0}$
Given $\Phi_{X,Y}(\omega_1, \omega_2) = e^{-2\omega_1^2 - 8\omega_2^2}$, find $\bar{X}$, $\bar{Y}$, $R_{XY}$, $C_{XY}$.
$\bar{X} = m_{10} = -j\left.\dfrac{\partial\Phi_{X,Y}}{\partial\omega_1}\right|_{\omega_1 = \omega_2 = 0} = -j(-4\omega_1)\,e^{-2\omega_1^2 - 8\omega_2^2}\big|_{\omega_1 = \omega_2 = 0} = 0$
$\bar{Y} = m_{01} = -j\left.\dfrac{\partial\Phi_{X,Y}}{\partial\omega_2}\right|_{\omega_1 = \omega_2 = 0} = -j(-16\omega_2)\,e^{-2\omega_1^2 - 8\omega_2^2}\big|_{\omega_1 = \omega_2 = 0} = 0$
$R_{XY} = m_{11} = (-j)^2 \left.\dfrac{\partial^2}{\partial\omega_1\,\partial\omega_2}\, e^{-(2\omega_1^2 + 8\omega_2^2)}\right|_{\omega_1 = \omega_2 = 0} = 0$
$C_{XY} = R_{XY} - \bar{X}\bar{Y} = 0 \Rightarrow$ X and Y are uncorrelated.
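The same moments can be obtained symbolically. The sketch below assumes SymPy is available and simply differentiates the characteristic function given in this example.

```python
import sympy as sp

w1, w2 = sp.symbols('w1 w2', real=True)
Phi = sp.exp(-2*w1**2 - 8*w2**2)     # the characteristic function of this example

def moment(n, k):
    """m_nk = (-j)^(n+k) d^(n+k) Phi / (dw1^n dw2^k), evaluated at w1 = w2 = 0."""
    d = Phi
    if n:
        d = sp.diff(d, w1, n)
    if k:
        d = sp.diff(d, w2, k)
    return sp.simplify((-sp.I)**(n + k) * d.subs({w1: 0, w2: 0}))

print(moment(1, 0))   # X bar  -> 0
print(moment(0, 1))   # Y bar  -> 0
print(moment(1, 1))   # R_XY   -> 0, so C_XY = 0 and X, Y are uncorrelated
print(moment(2, 0))   # E[X^2] -> 4 (hence sigma_X^2 = 4)
```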
Joint Characteristic Function for N r.v.'s
For r.v.'s $X_1, X_2, \ldots, X_N$,
$\Phi_{X_1,\ldots,X_N}(\omega_1, \ldots, \omega_N) = E\left[e^{j\omega_1 X_1 + \cdots + j\omega_N X_N}\right]$
and the joint moments are obtained from
$m_{n_1 n_2 \cdots n_N} = (-j)^{n_1 + \cdots + n_N} \left.\dfrac{\partial^{n_1 + \cdots + n_N}\Phi_{X_1,\ldots,X_N}(\omega_1, \omega_2, \ldots, \omega_N)}{\partial\omega_1^{n_1}\, \partial\omega_2^{n_2} \cdots \partial\omega_N^{n_N}}\right|_{\text{all } \omega_k = 0}$
Example
$Y = X_1 + X_2 + \cdots + X_N$, where the $X_i$, $i = 1, 2, \ldots, N$, are statistically independent r.v.'s with densities $f_{X_i}(x_i)$ and characteristic functions $\Phi_{X_i}(\omega_i)$. By independence,
$\Phi_{X_1,\ldots,X_N}(\omega_1, \ldots, \omega_N) = \prod_{i=1}^{N} \Phi_{X_i}(\omega_i)$
and hence $\Phi_Y(\omega) = \Phi_{X_1,\ldots,X_N}(\omega, \ldots, \omega) = \prod_{i=1}^{N} \Phi_{X_i}(\omega)$.
Jointly Gaussian: Two R.V.'s
$f_{X,Y}(x,y) = \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\dfrac{1}{2(1-\rho^2)}\left[\dfrac{(x-\bar{X})^2}{\sigma_X^2} - \dfrac{2\rho(x-\bar{X})(y-\bar{Y})}{\sigma_X\sigma_Y} + \dfrac{(y-\bar{Y})^2}{\sigma_Y^2}\right]\right\}$
$= \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\dfrac{1}{2(1-\rho^2)}\left[\left(\dfrac{x-\bar{X}}{\sigma_X}\right)^2 - 2\rho\left(\dfrac{x-\bar{X}}{\sigma_X}\right)\left(\dfrac{y-\bar{Y}}{\sigma_Y}\right) + \left(\dfrac{y-\bar{Y}}{\sigma_Y}\right)^2\right]\right\}$
with $\rho = \dfrac{C_{XY}}{\sigma_X\sigma_Y}$.
The maximum occurs at $x = \bar{X}$, $y = \bar{Y}$, i.e.
$f_{X,Y}(x,y) \le f_{X,Y}(\bar{X}, \bar{Y}) = \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}$
Jointly Gaussian: Uncorrelated R.V.'s are Independent
Case of uncorrelated X, Y, i.e. $\rho = 0$:
$f_{X,Y}(x,y) = \dfrac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}} \exp\left\{-\dfrac{1}{2(1-\rho^2)}\left[\dfrac{(x-\bar{X})^2}{\sigma_X^2} - \dfrac{2\rho(x-\bar{X})(y-\bar{Y})}{\sigma_X\sigma_Y} + \dfrac{(y-\bar{Y})^2}{\sigma_Y^2}\right]\right\}$
$= \dfrac{1}{2\pi\sigma_X\sigma_Y} \exp\left\{-\dfrac{1}{2}\left[\dfrac{(x-\bar{X})^2}{\sigma_X^2} + \dfrac{(y-\bar{Y})^2}{\sigma_Y^2}\right]\right\}$
For uncorrelated Gaussian X, Y: $\quad f_{X,Y}(x,y) = f_X(x)\, f_Y(y) \Rightarrow$ X and Y are independent.
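A numerical spot-check of this factorization using SciPy's multivariate normal; the means, standard deviations, and evaluation point below are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

mx, my = 1.0, -2.0            # assumed means
sx, sy = 2.0, 3.0             # assumed standard deviations
cov = np.diag([sx**2, sy**2]) # rho = 0: diagonal covariance matrix

x, y = 0.5, 1.5               # arbitrary evaluation point
joint = multivariate_normal(mean=[mx, my], cov=cov).pdf([x, y])
product = norm(mx, sx).pdf(x) * norm(my, sy).pdf(y)
print(joint, product)         # the two values coincide
```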
Can we remove the correlation between two r.v.'s by a proper rotation $\theta$?
Example: For any r.v.'s $X_1, X_2$ we can form two new r.v.'s $Y_1, Y_2$ by rotating the axes through an angle $\theta$ chosen to make $Y_1, Y_2$ uncorrelated.
[Figure: original axes $(X_1, X_2)$ and rotated axes $(Y_1, Y_2)$ at angle $\theta$; the same point has coordinates $(x_1, x_2)$ and $(y_1, y_2)$.]
$Y_1 = X_1\cos\theta + X_2\sin\theta$
$Y_2 = -X_1\sin\theta + X_2\cos\theta$
We want the $\theta$ that makes $C_{Y_1 Y_2} = 0$.
$C_{Y_1 Y_2} = \mu_{11} = E[(Y_1 - \bar{Y}_1)(Y_2 - \bar{Y}_2)]$
$= E[\{(X_1 - \bar{X}_1)\cos\theta + (X_2 - \bar{X}_2)\sin\theta\}\{-(X_1 - \bar{X}_1)\sin\theta + (X_2 - \bar{X}_2)\cos\theta\}]$
$= E[-(X_1 - \bar{X}_1)^2\cos\theta\sin\theta + (X_2 - \bar{X}_2)^2\sin\theta\cos\theta + (X_1 - \bar{X}_1)(X_2 - \bar{X}_2)(\cos^2\theta - \sin^2\theta)]$
$= -\sigma_{X_1}^2\sin\theta\cos\theta + C_{X_1 X_2}\cos^2\theta - C_{X_1 X_2}\sin^2\theta + \sigma_{X_2}^2\sin\theta\cos\theta$
$= -\tfrac{1}{2}(\sigma_{X_1}^2 - \sigma_{X_2}^2)\sin(2\theta) + C_{X_1 X_2}\cos(2\theta)$
Setting $C_{Y_1 Y_2} = 0$ we get
$\tfrac{1}{2}(\sigma_{X_1}^2 - \sigma_{X_2}^2)\sin(2\theta) = C_{X_1 X_2}\cos(2\theta) = \rho\,\sigma_{X_1}\sigma_{X_2}\cos(2\theta)$
$\tan(2\theta) = \dfrac{2\rho\,\sigma_{X_1}\sigma_{X_2}}{\sigma_{X_1}^2 - \sigma_{X_2}^2} \quad\Rightarrow\quad \theta = \dfrac{1}{2}\tan^{-1}\left(\dfrac{2\rho\,\sigma_{X_1}\sigma_{X_2}}{\sigma_{X_1}^2 - \sigma_{X_2}^2}\right)$
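A sketch of this decorrelating rotation for an assumed covariance matrix; `arctan2` is used so the $\sigma_{X_1}^2 = \sigma_{X_2}^2$ case does not divide by zero.

```python
import numpy as np

def decorrelating_angle(cov):
    """Rotation angle (radians) that zeroes the covariance of the rotated pair."""
    var1, var2 = cov[0, 0], cov[1, 1]
    c12 = cov[0, 1]
    # arctan2 handles the var1 == var2 case without dividing by zero
    return 0.5 * np.arctan2(2.0 * c12, var1 - var2)

# Assumed covariance matrix of (X1, X2)
C_X = np.array([[4.0, 1.5],
                [1.5, 9.0]])
theta = decorrelating_angle(C_X)
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
C_Y = R @ C_X @ R.T
print(theta)
print(C_Y)    # the off-diagonal entries are ~ 0
```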
Jointly Gaussian: N r.v.'s $X_1, X_2, \ldots, X_N$
$f_{X_1,\ldots,X_N}(x_1, \ldots, x_N) = \dfrac{1}{(2\pi)^{N/2}\,|C_X|^{1/2}} \exp\left\{-\tfrac{1}{2}(\mathbf{x} - \bar{\mathbf{X}})^t [C_X]^{-1} (\mathbf{x} - \bar{\mathbf{X}})\right\}$
where
$(\mathbf{x} - \bar{\mathbf{X}}) = \begin{bmatrix} x_1 - \bar{X}_1 \\ x_2 - \bar{X}_2 \\ \vdots \\ x_N - \bar{X}_N \end{bmatrix}$, $\quad C_{ij} = E[(X_i - \bar{X}_i)(X_j - \bar{X}_j)] = \begin{cases} \sigma_{X_i}^2 & i = j \\ C_{X_i X_j} & i \neq j \end{cases}$, $\quad C_X = \begin{bmatrix} C_{11} & C_{12} & \cdots & C_{1N} \\ C_{21} & C_{22} & \cdots & C_{2N} \\ \vdots & & & \vdots \\ C_{N1} & C_{N2} & \cdots & C_{NN} \end{bmatrix}$
Case of N = 2
$C_X = \begin{bmatrix} \sigma_{X_1}^2 & \rho\,\sigma_{X_1}\sigma_{X_2} \\ \rho\,\sigma_{X_1}\sigma_{X_2} & \sigma_{X_2}^2 \end{bmatrix}$, $\quad |C_X| = (1-\rho^2)\,\sigma_{X_1}^2\sigma_{X_2}^2$
$[C_X]^{-1} = \dfrac{1}{(1-\rho^2)\,\sigma_{X_1}^2\sigma_{X_2}^2}\begin{bmatrix} \sigma_{X_2}^2 & -\rho\,\sigma_{X_1}\sigma_{X_2} \\ -\rho\,\sigma_{X_1}\sigma_{X_2} & \sigma_{X_1}^2 \end{bmatrix}$
$f_{X_1,X_2}(x_1, x_2) = \dfrac{1}{(2\pi)^{2/2}\,|C_X|^{1/2}} \exp\left\{-\tfrac{1}{2}(\mathbf{x} - \bar{\mathbf{X}})^t [C_X]^{-1} (\mathbf{x} - \bar{\mathbf{X}})\right\}$
One can verify that this reduces to
$f_{X_1,X_2}(x_1, x_2) = \dfrac{1}{2\pi\sigma_{X_1}\sigma_{X_2}\sqrt{1-\rho^2}} \exp\left\{-\dfrac{1}{2(1-\rho^2)}\left[\dfrac{(x_1-\bar{X}_1)^2}{\sigma_{X_1}^2} + \dfrac{(x_2-\bar{X}_2)^2}{\sigma_{X_2}^2} - \dfrac{2\rho(x_1-\bar{X}_1)(x_2-\bar{X}_2)}{\sigma_{X_1}\sigma_{X_2}}\right]\right\}$
Notes on Gaussian r.v.'s
1. Only the means, variances, and covariances are needed to completely characterize Gaussian r.v.'s.
2. Uncorrelated $\Rightarrow$ statistically independent.
3. If $X_i$, $i = 1, 2, \ldots, n$, are jointly Gaussian, then $\sum_{i=1}^{n} a_i X_i$ is Gaussian.
4. Any k-dimensional marginal density is also Gaussian.
5. The conditional density is also Gaussian, i.e. $f_{X_1, X_2, \ldots, X_k}(x_1, x_2, \ldots, x_k \mid \{X_{k+1} = x_{k+1}, \ldots, X_N = x_N\})$ is Gaussian.
Linear Transformation of Multiple r.v.'s
$\mathbf{Y} = T\mathbf{X}$, where $\mathbf{Y}$ is an $N \times 1$ vector, $T$ is an $N \times N$ matrix, and $\mathbf{X}$ is an $N \times 1$ vector.
$E[\mathbf{Y}] = T\,E[\mathbf{X}]$
$E[\mathbf{Y}\mathbf{Y}^t] = E[T\mathbf{X}\mathbf{X}^t T^t] \quad\Rightarrow\quad R_Y = T R_X T^t$
Also $E[(\mathbf{Y} - \bar{\mathbf{Y}})(\mathbf{Y} - \bar{\mathbf{Y}})^t] = E[T(\mathbf{X} - \bar{\mathbf{X}})(\mathbf{X} - \bar{\mathbf{X}})^t T^t]$, from which we get $C_Y = T C_X T^t$.
Example: Transformation of Multiple Gaussian r.v.'s
$X_1 \sim N(0, 4)$, $X_2 \sim N(0, 9)$, $C_{X_1 X_2} = 3$. Let $Y_1 = X_1 - 2X_2$, $Y_2 = 3X_1 + 4X_2$. Find the means, variances, and covariance of $Y_1$ and $Y_2$.
$E[Y_1] = E[X_1] - 2E[X_2] = 0$, $\quad E[Y_2] = 0$.
$E[Y_1^2] = E[X_1^2] - 4C_{X_1 X_2} + 4E[X_2^2] = 4 - 12 + 36 = 28$
$E[Y_2^2] = 9E[X_1^2] + 24C_{X_1 X_2} + 16E[X_2^2] = 36 + 72 + 144 = 252$
$E[Y_1 Y_2] = E[3X_1^2 - 2X_1 X_2 - 8X_2^2] = 12 - 6 - 72 = -66$
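The same numbers follow from the matrix relations $E[\mathbf{Y}] = T\,E[\mathbf{X}]$ and $C_Y = T C_X T^t$ of the previous slide. A minimal sketch:

```python
import numpy as np

# Covariance matrix of (X1, X2): variances 4 and 9, covariance 3
C_X = np.array([[4.0, 3.0],
                [3.0, 9.0]])
mu_X = np.array([0.0, 0.0])

# Y1 = X1 - 2*X2, Y2 = 3*X1 + 4*X2
T = np.array([[1.0, -2.0],
              [3.0,  4.0]])

mu_Y = T @ mu_X
C_Y = T @ C_X @ T.T
print(mu_Y)   # [0. 0.]
print(C_Y)    # [[ 28. -66.], [-66. 252.]]  (means are zero, so C_Y = R_Y here)
```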
Example: Gaussian random vector $\mathbf{X} \sim N(\boldsymbol{\mu}, C_X)$ with
$\boldsymbol{\mu} = \begin{bmatrix} 1 \\ 5 \\ 2 \end{bmatrix}$, $\quad C_X = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 4 & 0 \\ 0 & 0 & 9 \end{bmatrix}$
Find:
1. pdf of $X_1$: a marginal of jointly Gaussian r.v.'s is Gaussian, so $X_1 \sim N(1, 1)$.
2. pdf of $X_2 + X_3$: here $C_{23} = C_{32} = 0$, so $X_2, X_3$ are uncorrelated; since they are Gaussian, they are independent. The sum of two jointly Gaussian r.v.'s is also Gaussian, the means add, and the variances add, so $X_2 + X_3 \sim N(7, 13)$.
3. pdf of $2X_1 + X_2 + X_3$: a linear combination of Gaussian r.v.'s, i.e.
$2X_1 + X_2 + X_3 = [2, 1, 1]\begin{bmatrix} X_1 \\ X_2 \\ X_3 \end{bmatrix}$
mean $= 2\bar{X}_1 + \bar{X}_2 + \bar{X}_3 = 2(1) + (5) + (2) = 9$
$\sigma^2 = [2, 1, 1]\, C_X \begin{bmatrix} 2 \\ 1 \\ 1 \end{bmatrix} = [3, 6, 9]\begin{bmatrix} 2 \\ 1 \\ 1 \end{bmatrix} = 21 \quad\Rightarrow\quad 2X_1 + X_2 + X_3 \sim N(9, 21)$
4. pdf of $X_3 \mid (X_1, X_2)$, i.e. $f(x_3 \mid x_1, x_2)$: since $C_{13} = C_{23} = 0$, $X_3$ is statistically independent of $X_1$ and of $X_2$, so $f(x_3 \mid x_1, x_2) = f(x_3)$ and $X_3 \mid (X_1, X_2) \sim N(2, 9)$.
5. $P\{2X_1 + X_2 + X_3 < 0\} = ?$ Let $Y = 2X_1 + X_2 + X_3$; as in the previous part, $Y \sim N(9, 21)$.
$P\{Y < 0\} = \Phi\left(\dfrac{0 - \bar{Y}}{\sigma_Y}\right) = \Phi\left(\dfrac{-9}{\sqrt{21}}\right) = \Phi(-1.96) = 1 - \Phi(1.96) \approx 0.025$
Equivalently, using the linear-transformation result with $Y = T\mathbf{X}$ and $T = [2, 1, 1]$:
$\bar{Y} = T\bar{\mathbf{X}} = 9$, $\quad C_Y = T C_X T^t = 21$, hence $Y \sim N(\bar{Y}, C_Y) = N(9, 21)$.
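A sketch reproducing this calculation with NumPy/SciPy, using the mean vector and covariance matrix of this example.

```python
import numpy as np
from scipy.stats import norm

mu = np.array([1.0, 5.0, 2.0])
C_X = np.array([[1.0, 1.0, 0.0],
                [1.0, 4.0, 0.0],
                [0.0, 0.0, 9.0]])
a = np.array([2.0, 1.0, 1.0])          # Y = 2*X1 + X2 + X3

mean_Y = a @ mu                        # 9
var_Y = a @ C_X @ a                    # 21
print(mean_Y, var_Y)
print(norm.cdf((0.0 - mean_Y) / np.sqrt(var_Y)))   # P{Y < 0} ~ 0.025
```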
Sampling and Some Limit Theorems
Sampling and Estimation: Estimation of Mean, Power, and Variance
Given N samples $x_n$ representing values of independent (at least pair-wise), identically distributed r.v.'s $X_n$, $n = 1, 2, \ldots, N$, define the r.v. $\hat{X}_N$ (the sample mean), whose observed value is
$\hat{x}_N = \dfrac{1}{N}\sum_{n=1}^{N} x_n$
Its mean is $E[\hat{X}_N] = \dfrac{1}{N}\sum_{n=1}^{N} E[X_n] = \bar{X}$, for any N.
This is an unbiased estimator: the mean of the estimate equals the mean of the r.v.
Its variance is $\sigma_{\hat{X}_N}^2 = E[(\hat{X}_N - \bar{X})^2] = E[\hat{X}_N^2 - 2\bar{X}\hat{X}_N + \bar{X}^2]$
$= -\bar{X}^2 + E\left[\dfrac{1}{N}\sum_{n=1}^{N} X_n \cdot \dfrac{1}{N}\sum_{m=1}^{N} X_m\right] = -\bar{X}^2 + \dfrac{1}{N^2}\sum_{n=1}^{N}\sum_{m=1}^{N} E[X_n X_m]$
Since the $X_n$ and $X_m$ are pairwise independent and identically distributed,
$E[X_n X_m] = \begin{cases} \bar{X}^2 & m \neq n \\ E[X^2] & m = n \end{cases}$
$\sum_{n=1}^{N}\sum_{m=1}^{N} E[X_n X_m] = N E(X^2) + (N^2 - N)\bar{X}^2$
Hence
$\sigma_{\hat{X}_N}^2 = -\bar{X}^2 + \dfrac{1}{N^2}\left[N E(X^2) + (N^2 - N)\bar{X}^2\right] = \dfrac{1}{N}\left[E(X^2) - \bar{X}^2\right] = \dfrac{\sigma_X^2}{N}$
hence $\sigma_{\hat{X}_N}^2 \to 0$ as $N \to \infty$.
Using Chebyshev's inequality,
$P\{|\hat{X}_N - \bar{X}| < \epsilon\} \ge 1 - \dfrac{\sigma_{\hat{X}_N}^2}{\epsilon^2} = 1 - \dfrac{\sigma_X^2}{N\epsilon^2} \to 1$ as $N \to \infty$,
so $\hat{X}_N$ is a consistent estimator.
Weak Law of Large Numbers: $\lim_{N\to\infty} P\{|\hat{X}_N - \bar{X}| < \epsilon\} = 1$, for any $\epsilon > 0$.
Strong Law of Large Numbers: $P\{\lim_{N\to\infty} \hat{X}_N = \bar{X}\} = 1$, i.e. $\hat{X}_N \to \bar{X}$ with probability 1 as $N \to \infty$.
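A sketch illustrating that the sample-mean variance decays like $\sigma_X^2/N$; the Gaussian population and its parameters below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
true_mean, true_var = 3.0, 2.0     # assumed population parameters

for N in (10, 100, 1000):
    # 10_000 independent experiments, each producing one sample mean of N samples
    samples = rng.normal(true_mean, np.sqrt(true_var), size=(10_000, N))
    means = samples.mean(axis=1)
    print(N, means.mean(), means.var(), true_var / N)
    # means.mean() ~ true_mean (unbiased); means.var() ~ sigma_X^2 / N (consistent)
```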
Complex Random Variables
$Z = X + jY$, where X, Y are real r.v.'s.
$E[g(Z)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(z)\, f_{X,Y}(x,y)\, dx\, dy$
$\bar{Z} = \bar{X} + j\bar{Y}$
$\sigma_Z^2 = E[|Z - E[Z]|^2]$
For two complex r.v.'s $Z_m, Z_n$: joint pdf $f_{X_m, Y_m, X_n, Y_n}(x_m, y_m, x_n, y_n)$.
If $f_{X_m, Y_m, X_n, Y_n}(x_m, y_m, x_n, y_n) = f_{X_m, Y_m}(x_m, y_m)\, f_{X_n, Y_n}(x_n, y_n)$, then $Z_m, Z_n$ are statistically independent. This can be extended to N r.v.'s.
Complex Variables: Correlation, Covariance, Independence, Orthogonality
Correlation: $R_{Z_m Z_n} = E[Z_m^* Z_n]$, $\quad n \neq m$
Covariance: $C_{Z_m Z_n} = E[(Z_m - \bar{Z}_m)^* (Z_n - \bar{Z}_n)]$, $\quad n \neq m$
Uncorrelated complex r.v.'s: $R_{Z_m Z_n} = E[Z_m^*]E[Z_n]$, $n \neq m$, equivalently $C_{Z_m Z_n} = 0$
Independence $\Rightarrow$ uncorrelatedness
Orthogonal: $R_{Z_m Z_n} = E[Z_m^* Z_n] = 0$
Summary
This chapter extends the single-variable operations of Chapter 3 to multiple random variables. Expected values of functions of random variables were developed, including joint moments about the origin and joint central moments, as well as joint characteristic functions, which are useful for finding moments. The new moments of special interest were correlation and covariance. Multiple jointly Gaussian random variables were defined. Transformation results were used to show that a linear transformation of jointly Gaussian random variables is especially important, as it produces random variables that are also jointly Gaussian. Some new material on the basics of sampling and on estimation of the mean, power, and variance was given. Finally, complex random variables and their characteristics were defined.