Chapter 3: Operations on One R.V. - Expectation
EE360 Random Signal Analysis - Electrical Engineering Department - JUST

Expectation = averaging = mean value = statistical average:

X̄ = E[X] = ∫_{-∞}^{∞} x f_X(x) dx   (continuous)
         = Σ_{i=1}^{N} x_i P(x_i)   (discrete)

If X has a pdf that is symmetrical around x = a, i.e. f_X(x + a) = f_X(-x + a), then E[X] = a.
Example: Ninety people have change as follows:

# of people:         8   12   28   22   15    5
change (in cents):  18   45   64   72   77   95

Average = (8·18 + 12·45 + 28·64 + 22·72 + 15·77 + 5·95)/90 = 63.22 cents,

or, weighting each amount by its relative frequency,

Average = (8/90)·18 + (12/90)·45 + (28/90)·64 + (22/90)·72 + (15/90)·77 + (5/90)·95 = 63.22 cents.
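The second form of the average above is exactly E[X] = Σ x_i P(x_i), with each count divided by 90 acting as P(x_i). A quick Python check of the arithmetic (the variable names are mine):

```python
# Pocket-change example: the average is a probability-weighted sum,
# with counts[i]/90 playing the role of P(x_i).
counts = [8, 12, 28, 22, 15, 5]     # number of people
cents  = [18, 45, 64, 72, 77, 95]   # change held (cents)

total_people = sum(counts)          # 90
# E[X] = sum_i x_i * P(x_i), with P(x_i) = counts[i] / total_people
average = sum(n * c for n, c in zip(counts, cents)) / total_people
print(round(average, 2))            # -> 63.22
```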
Example: Find X̄ for the continuous, exponential r.v. with

f_X(x) = (1/b) e^{-(x-a)/b},  x > a
       = 0,                   x < a

E[X] = ∫_a^∞ x (1/b) e^{-(x-a)/b} dx = (e^{a/b}/b) ∫_a^∞ x e^{-x/b} dx = a + b,

using ∫ x e^{αx} dx = e^{αx} [x/α - 1/α²] from Appendix C.
Expected value of a function of a r.v.:

E[g(X)] = ∫_{-∞}^{∞} g(x) f_X(x) dx   (continuous)
        = Σ_{i=1}^{N} g(x_i) P(x_i)   (discrete)

If g(x) = Σ_{k=1}^{N} g_k(x), then E[g(X)] = Σ_{k=1}^{N} E[g_k(X)].
Example: V is a Rayleigh r.v. with a = 0, b = 5, and Y = g(V) = V².

E[g(V)] = E[V²] = ∫_{-∞}^{∞} v² f_V(v) dv = ∫_a^∞ v² (2/b)(v - a) e^{-(v-a)²/b} dv = ∫_0^∞ v² (2v/5) e^{-v²/5} dv.

Let ξ = v²/5, dξ = 2v dv/5, and substitute:

E[V²] = 5 ∫_0^∞ ξ e^{-ξ} dξ = 5.
Example: Average Information (Entropy). X can be one of L symbols, each having probability P(x_i). Define the information in a symbol x_i as

log₂(1/P(x_i)) = -log₂(P(x_i)).

The average information (entropy) of the discrete source, in bits per symbol, is

H = -Σ_{i=1}^{L} P(x_i) log₂(P(x_i)).
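The entropy definition translates directly into a short function. A minimal Python sketch (the function name is mine); the checks use the standard facts that a fair binary source carries 1 bit/symbol and four equiprobable symbols carry 2 bits/symbol:

```python
import math

# H = -sum_i P(x_i) * log2 P(x_i): average information per symbol.
def entropy(probs):
    """Entropy in bits/symbol for a list of symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair binary source -> 1.0 bit/symbol
print(entropy([0.25] * 4))   # 4 equiprobable symbols -> 2.0 bits/symbol
```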
Conditional Expected Value

E[X|B] = ∫_{-∞}^{∞} x f_X(x|B) dx.

If B = {X ≤ b}, -∞ < b < ∞, then using the results from Chapter 2:

f_X(x|X ≤ b) = f_X(x) / ∫_{-∞}^{b} f_X(x) dx,   x < b
             = 0,                                x ≥ b

to get

E[X|X ≤ b] = ∫_{-∞}^{b} x f_X(x) dx / ∫_{-∞}^{b} f_X(x) dx.
Moments

1. Moments about the origin:

m_n = nth moment = E[Xⁿ] = ∫_{-∞}^{∞} xⁿ f_X(x) dx.

Here m₀ = 1.0 and m₁ = X̄.

2. Central moments:

μ_n = nth central moment = E[(X - X̄)ⁿ] = ∫_{-∞}^{∞} (x - X̄)ⁿ f_X(x) dx.

Here μ₀ = 1.0 and μ₁ = 0.
Variance and Skew

σ_X² = variance = μ₂ = second central moment
     = E[(X - X̄)²] = ∫_{-∞}^{∞} (x - X̄)² f_X(x) dx
     = E[X² - 2X̄X + X̄²] = E[X²] - 2X̄E[X] + X̄²
     = E[X²] - X̄² = m₂ - m₁².

Skew of the density function = the third central moment μ₃ = E[(X - X̄)³], a measure of the asymmetry of f_X(x) about x = X̄ = m₁.

Skewness (coefficient of skewness) = μ₃ / σ_X³.

If f_X(x) is symmetric about x = X̄, it has zero skew.
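These relations are easy to exercise on a small discrete pmf. The sketch below borrows the pmf used later in the chapter's discrete-transformation example (x = -1, 0, 1, 2 with probabilities 0.1, 0.3, 0.4, 0.2) and computes the variance two ways: directly, and as m₂ - m₁²:

```python
# Relate moments about the origin to variance and skew for a small pmf.
xs = [-1, 0, 1, 2]
ps = [0.1, 0.3, 0.4, 0.2]

m1 = sum(p * x for x, p in zip(xs, ps))              # mean, X-bar
m2 = sum(p * x ** 2 for x, p in zip(xs, ps))         # second moment
mu2 = m2 - m1 ** 2                                   # variance = m2 - m1^2
mu3 = sum(p * (x - m1) ** 3 for x, p in zip(xs, ps)) # third central moment
skewness = mu3 / mu2 ** 1.5                          # coefficient of skewness

print(round(m1, 4), round(mu2, 4), round(mu3, 4))    # -> 0.7 0.81 -0.144
```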
Example: X is an exponentially distributed r.v.

E[X] = ∫_a^∞ x (1/b) e^{-(x-a)/b} dx = a + b   (done in an earlier example)

σ_X² = ∫_a^∞ (x - X̄)² (1/b) e^{-(x-a)/b} dx.

Let ξ = x - X̄:

σ_X² = (e^{-(X̄-a)/b}/b) ∫_{a-X̄}^∞ ξ² e^{-ξ/b} dξ = (a + b - X̄)² + b² = b².
Example: For the previous example (exponential pdf), compute the skew and coefficient of skewness.

E[(X - X̄)³] = E[X³ - 3X̄X² + 3X̄²X - X̄³]
            = E[X³] - 3X̄E[X²] + 2X̄³
            = E[X³] - 3X̄(σ_X² + X̄²) + 2X̄³
            = E[X³] - 3X̄σ_X² - X̄³

E[X³] = ∫_a^∞ x³ (1/b) e^{-(x-a)/b} dx = a³ + 3a²b + 6ab² + 6b³,

and since X̄ = a + b and σ_X² = b², we get

μ₃ = 2b³   and   μ₃/σ_X³ = 2.
Chebychev's Inequality. For any r.v. X having mean X̄ and variance σ_X²:

P{|X - X̄| ≥ ε} ≤ σ_X²/ε²,       ε > 0
P{|X - X̄| < ε} ≥ 1 - σ_X²/ε²,   ε > 0

Proof:

σ_X² = ∫_{-∞}^{∞} (x - X̄)² f_X(x) dx ≥ ∫_{|x-X̄|≥ε} (x - X̄)² f_X(x) dx
     ≥ ε² ∫_{|x-X̄|≥ε} f_X(x) dx = ε² P{|X - X̄| ≥ ε}.
Chebychev's Inequality - alternate form. For any r.v. X having mean X̄ and variance σ_X²:

P{|X - X̄| ≥ kσ_X} ≤ 1/k²,       k > 0
P{|X - X̄| < kσ_X} ≥ 1 - 1/k²,   k > 0

Proof:

σ_X² = ∫_{-∞}^{∞} (x - X̄)² f_X(x) dx ≥ ∫_{|x-X̄|≥kσ_X} (x - X̄)² f_X(x) dx
     ≥ k²σ_X² ∫_{|x-X̄|≥kσ_X} f_X(x) dx = k²σ_X² P{|X - X̄| ≥ kσ_X},

from which P{|X - X̄| ≥ kσ_X} ≤ 1/k², k > 0.
Example: Find the largest value of P{|X - X̄| ≥ 3σ_X}. Use Chebychev's inequality with ε = 3σ_X to get

P{|X - X̄| ≥ 3σ_X} ≤ σ_X²/(3σ_X)² = 1/9 ≈ 11.1%.
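The 1/9 ceiling is distribution-free, so for any particular distribution the actual tail probability can be far smaller. A hedged empirical sketch (my choice of a Gaussian test case, where the true 3σ tail probability is about 0.0027):

```python
import random

# Empirical check: the observed fraction beyond 3 sigma must sit below
# the Chebychev bound of 1/9; for a Gaussian it is in fact ~0.0027.
random.seed(1)
N = 200_000
samples = [random.gauss(0.0, 1.0) for _ in range(N)]
frac = sum(abs(x) >= 3.0 for x in samples) / N

assert frac <= 1 / 9     # Chebychev bound always holds
print(frac)              # roughly 0.0027 for a Gaussian
```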
Markov's Inequality. If X is a non-negative r.v.,

P{X ≥ a} ≤ E[X]/a,   a > 0.

Proof:

E[X] = ∫_0^∞ x f_X(x) dx ≥ ∫_a^∞ x f_X(x) dx ≥ a ∫_a^∞ f_X(x) dx = a P{X ≥ a}.
Bienaymé Inequality. In the Markov inequality P{Y ≥ α} ≤ E[Y]/α, α > 0, let Y = |X - a|ⁿ ≥ 0 and let α = εⁿ:

P{|X - a|ⁿ ≥ εⁿ} ≤ E[|X - a|ⁿ]/εⁿ,

i.e.

P{|X - a| ≥ ε} ≤ E[|X - a|ⁿ]/εⁿ.

One can derive Chebyshev's inequality by setting a = η = X̄, n = 2, and ε = kσ in the Bienaymé inequality.
Characteristic Function. The characteristic function of a r.v. X is defined as

Φ_X(ω) = E[e^{jωX}] = ∫_{-∞}^{∞} f_X(x) e^{jωx} dx,   j = √(-1),  -∞ < ω < ∞,

and

f_X(x) = (1/2π) ∫_{-∞}^{∞} Φ_X(ω) e^{-jωx} dω.

The nth moment of X is given by

m_n = (-j)ⁿ dⁿΦ_X(ω)/dωⁿ |_{ω=0}.

Note that Φ_X(ω) always exists, since |Φ_X(ω)| ≤ Φ_X(0) = 1 (Problem 3.3.1).
Example: Find the characteristic function Φ_X(ω) for an exponential r.v. X.

Φ_X(ω) = E[e^{jωX}] = ∫_{-∞}^{∞} f_X(x) e^{jωx} dx = ∫_a^∞ (1/b) e^{-(x-a)/b} e^{jωx} dx
       = (e^{a/b}/b) ∫_a^∞ e^{-(1/b - jω)x} dx
       = (e^{a/b}/b) [e^{-(1/b - jω)x} / (-(1/b - jω))]_a^∞
       = e^{jωa} / (1 - jωb).

Its first moment is

m₁ = (-j) dΦ_X(ω)/dω |_{ω=0} = (-j) e^{jωa} [ja/(1 - jωb) + jb/(1 - jωb)²] |_{ω=0} = a + b.
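The moment formula m_n = (-j)ⁿ dⁿΦ_X/dωⁿ|₀ can be sanity-checked numerically: approximate the derivative of the exponential characteristic function at ω = 0 by a central difference and compare with a + b. The parameter values a = 1, b = 2 below are my assumed test case:

```python
import cmath

# m1 = (-j) * dPhi/domega at omega = 0, approximated numerically for
# the exponential characteristic function Phi(w) = e^{jwa}/(1 - jwb).
a, b = 1.0, 2.0
phi = lambda w: cmath.exp(1j * w * a) / (1 - 1j * w * b)

h = 1e-6                                        # central-difference step
m1 = (-1j * (phi(h) - phi(-h)) / (2 * h)).real
print(round(m1, 4))                             # -> 3.0, i.e. a + b
```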
Moment Generating Function

M_X(ν) = E[e^{νX}] = ∫_{-∞}^{∞} f_X(x) e^{νx} dx, where ν is a real number, -∞ < ν < ∞.

If M_X(ν) exists, moments can be evaluated from the relation

m_n = dⁿM_X(ν)/dνⁿ |_{ν=0}.
Example: Find M_X(ν) and its first moment for an exponential r.v. X.

M_X(ν) = E[e^{νX}] = ∫_{-∞}^{∞} f_X(x) e^{νx} dx = ∫_a^∞ (1/b) e^{-(x-a)/b} e^{νx} dx
       = (e^{a/b}/b) ∫_a^∞ e^{[ν - (1/b)]x} dx = e^{aν}/(1 - bν).

Its first moment can be evaluated from the relation

m₁ = dM_X(ν)/dν |_{ν=0} = e^{aν} [a(1 - bν) + b]/(1 - bν)² |_{ν=0} = a + b.
Chernoff's Inequality and Bound

Chernoff's inequality: P{X ≥ a} ≤ e^{-νa} M_X(ν), where a is an arbitrary real constant and ν > 0.

Proof: note that e^{ν(x-a)} ≥ u(x - a). Then

P{X ≥ a} = ∫_a^∞ f_X(x) dx = ∫_{-∞}^{∞} f_X(x) u(x - a) dx ≤ ∫_{-∞}^{∞} f_X(x) e^{ν(x-a)} dx = e^{-νa} M_X(ν).

Chernoff's bound: taking the minimum of the right-hand side with respect to ν gives Chernoff's bound.
Transformations of a Random Variable. Given a r.v. X having pdf f_X(x) and another r.v. Y related to X by Y = T(X), where T(·) is a transformation (mapping) from X to Y, find f_Y(y).
Figure 1: Monotonic transformation: (a) increasing, (b) decreasing.
Monotonic Transformation of a Continuous R.V.

All quantities involved are probability measures, i.e. positive quantities:

P{y₁ ≤ Y < y₁ + dy} = P{x₁ ≤ X < x₁ + dx}
f_Y(y) |dy| = f_X(x) |dx|,

from which

f_Y(y) = f_X(x) / |dy/dx|,   evaluated at x = T⁻¹(y).
Example: Linear transformation T such that Y = T(X) = aX + b, with a, b real constants. Then X = T⁻¹(Y) = (Y - b)/a, dx/dy = 1/a, and

f_Y(y) = f_X((y - b)/a) (1/|a|).

If f_X(x) = (1/√(2πσ_X²)) e^{-(x - a_X)²/(2σ_X²)}, then

f_Y(y) = (1/√(2πσ_X²)) e^{-[(y-b)/a - a_X]²/(2σ_X²)} (1/|a|) = (1/√(2πa²σ_X²)) e^{-[y - (a a_X + b)]²/(2a²σ_X²)},

which is a Gaussian r.v. with mean a_Y = a a_X + b and variance σ_Y² = a²σ_X².
Nonmonotonic Transformation of a Continuous R.V.

When y = y₁ has several roots x₁, x₂, x₃ (all quantities involved are probability measures, i.e. positive quantities):

P{y₁ ≤ Y < y₁ + dy} = Σ_{i=1}^{3} P{x_i ≤ X < x_i + dx}
f_Y(y₁) |dy| = Σ_{i=1}^{3} f_X(x_i) |dx|,

from which

f_Y(y₁) = Σ_{i=1}^{3} f_X(x_i) / |dy/dx|_{x=x_i}.
Transformation of r.v.s: Example. X has pdf f_X(x), and Y = T(X) = cX² with c real, c > 0. Find the distribution of Y.

F_Y(y) = P{Y ≤ y} = P{cX² ≤ y} = P{-√(y/c) ≤ X ≤ √(y/c)} = F_X(√(y/c)) - F_X(-√(y/c)),

from which its derivative is

f_Y(y) = f_X(√(y/c)) d(√(y/c))/dy - f_X(-√(y/c)) d(-√(y/c))/dy
       = [f_X(√(y/c)) + f_X(-√(y/c))] / (2√(cy)),   y ≥ 0.
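The CDF form F_Y(y) = F_X(√(y/c)) - F_X(-√(y/c)) is easy to verify numerically. A sketch under my assumed test case of c = 1 and a standard Gaussian X, where P{X² ≤ 1} = P{-1 ≤ X ≤ 1} is the familiar 68.27%:

```python
import math

# F_Y(y) = F_X(sqrt(y/c)) - F_X(-sqrt(y/c)) for Y = c*X^2.
def std_normal_cdf(x):
    # Gaussian CDF expressed through the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

c, y = 1.0, 1.0
F_Y = std_normal_cdf(math.sqrt(y / c)) - std_normal_cdf(-math.sqrt(y / c))
print(round(F_Y, 4))   # P{X^2 <= 1} = P{-1 <= X <= 1} -> 0.6827
```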
Transformation of r.v.s: Example (Ross). X ~ U(0,1), Y = Xⁿ. Find the distribution of Y for 0 ≤ y ≤ 1.

F_Y(y) = P{Y ≤ y} = P{Xⁿ ≤ y} = P{X ≤ y^{1/n}} = F_X(y^{1/n}) = y^{1/n},

from which its derivative is

f_Y(y) = (1/n) y^{(1/n)-1} (u(y) - u(y - 1)).
Transformation of a Discrete R.V.: Example. Let the r.v. X take values x = -1, 0, 1, 2 with probabilities 0.1, 0.3, 0.4, 0.2, i.e.

f_X(x) = 0.1δ(x + 1) + 0.3δ(x) + 0.4δ(x - 1) + 0.2δ(x - 2).

If T(X) = Y = 2 - X² + X³/3, find f_Y(y).

The corresponding values of y are 2/3, 2, 4/3, and 2/3. Here P{Y = 2/3} = P{X = -1} + P{X = 2}, and

f_Y(y) = 0.3δ(y - 2/3) + 0.4δ(y - 4/3) + 0.3δ(y - 2).
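For a discrete r.v. the recipe is mechanical: push each x through T and merge probabilities of x-values that land on the same y. A sketch using exact rational arithmetic so the merge at y = 2/3 is exact:

```python
from fractions import Fraction as F
from collections import defaultdict

# Push the pmf of X through Y = 2 - X^2 + X^3/3 and merge
# x-values that map to the same y (here x = -1 and x = 2 both give 2/3).
pmf_x = {-1: F(1, 10), 0: F(3, 10), 1: F(4, 10), 2: F(2, 10)}

pmf_y = defaultdict(F)          # F() == Fraction(0), a valid default
for x, p in pmf_x.items():
    y = 2 - F(x) ** 2 + F(x) ** 3 / 3
    pmf_y[y] += p

print(pmf_y[F(2, 3)], pmf_y[F(4, 3)], pmf_y[F(2)])   # -> 3/10 2/5 3/10
```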
Computer Generation of One R.V. Find the transformation T(X) that will create a r.v. Y of prescribed distribution function when X has a uniform distribution on (0,1).

Pick x ∈ (0,1), with y = T(x) and x = T⁻¹(y). Here we set x equal to F_Y(y), i.e. F_Y(y) = x, 0 < x < 1, and solve for y = T(x).
Computer Generation of One R.V. (cont. 1) - Example: Generate a Rayleigh r.v. with a = 0.

x = F_Y(y) = 1 - e^{-y²/b},   0 < x < 1
1 - x = e^{-y²/b}
y²/b = -ln(1 - x)
y = T(x) = √(-b ln(1 - x)),   0 < x < 1.
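A Python analogue of this inverse-CDF recipe (the parameter b = 4 and sample size are my choices). For F(y) = 1 - e^{-y²/b} the theoretical mean is √(πb)/2, which the sample mean should approach:

```python
import math
import random

# Inverse-transform sampling: x ~ U(0,1) mapped through
# y = sqrt(-b*ln(1-x)) has CDF F(y) = 1 - exp(-y^2/b), y >= 0.
random.seed(7)
b = 4.0
samples = [math.sqrt(-b * math.log(1.0 - random.random()))
           for _ in range(100_000)]

mean = sum(samples) / len(samples)
print(round(mean, 2))   # theory: sqrt(pi*b)/2 = sqrt(pi) ~ 1.77
```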
Computer Generation of One R.V. (cont. 2) - Example: Generate an arcsine r.v.

F_Y(y) = 0,                           y ≤ -a
       = 1/2 + (1/π) sin⁻¹(y/a),     -a < y < a
       = 1,                           a ≤ y

Setting F_Y(y) = x and solving, y = a sin[π(2x - 1)/2], 0 < x < 1.
Computer Generation of One R.V. (cont. 3) - Gaussian from Uniform: the Box-Muller Transformation.

Let R₁, R₂ ~ U[0,1]. Generate two Gaussian r.v.s N₁, N₂, both zero mean and unit variance, as follows:

N₁ = (-2 ln R₁)^{1/2} cos(2πR₂)   and   N₂ = (-2 ln R₁)^{1/2} sin(2πR₂).

Note that r² = N₁² + N₂² = -2 ln R₁ and Φ = 2πR₂.

If two Gaussian r.v.s X, Y ~ N(m, σ²) with mean m and variance σ² are needed, we use the transformations X = m + σN₁ and Y = m + σN₂.
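A Python sketch of the Box-Muller recipe (helper name and sample size are mine). The guard `1 - random.random()` keeps the argument of the logarithm in (0, 1], avoiding log(0):

```python
import math
import random

# Box-Muller: two independent uniforms -> two independent N(0,1) variates.
random.seed(3)

def box_muller():
    r1 = 1.0 - random.random()    # in (0, 1], so log(r1) is finite
    r2 = random.random()
    mag = math.sqrt(-2.0 * math.log(r1))
    return mag * math.cos(2 * math.pi * r2), mag * math.sin(2 * math.pi * r2)

pairs = [box_muller() for _ in range(50_000)]
n1 = [p[0] for p in pairs]
mean = sum(n1) / len(n1)
var = sum(x * x for x in n1) / len(n1) - mean ** 2
print(round(mean, 2), round(var, 2))   # near 0 and 1
```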
Figure 2: A square-law transformation.
Figure 3: Example 3.5.3: Uniform r.v., histogram and true density function.
Generation of Rayleigh CPDF and PDF

%Rayleigh distributed r.v.
x=rand(10000,1);
a=1.5; b=2;               %b > 0
y=a+sqrt(-b*log(1-x));
[X,IX]=sort(x);
Y=y(IX);                  %sort the (x,y) pairs together
plot(Y,X);                %distribution function
print -deps rayleighcpdf.eps
pause
N=100;
hist(y,N);                %plot pdf into N range bins
print -deps rayleighpdf.eps
Figures: PDF and CPDF for the generated Rayleigh r.v.
Generation of a Gaussian r.v.

% Normal distribution
Npoints=10000;
r1=rand(Npoints,1);
r2=rand(Npoints,1);
n1=sqrt(-2*log(r1)).*cos(2*pi*r2);
n2=sqrt(-2*log(r1)).*sin(2*pi*r2);
Nbins=100;
hist(n1,Nbins);
print -deps gaussianpdf.eps
Figure: PDF (histogram) for the generated Gaussian samples.
Figures: Gaussian PDF and CPDF with mean = 3, sigma = 1.
Problem, page 102: Find Φ_X(ω) for the binomial distribution, where C(N,k) denotes the binomial coefficient.

Φ_X(ω) = E[e^{jωX}] = Σ_{k=0}^{N} P{X = k} e^{jωk}
       = Σ_{k=0}^{N} C(N,k) p^k (1-p)^{N-k} e^{jωk}
       = Σ_{k=0}^{N} C(N,k) (pe^{jω})^k (1-p)^{N-k}
       = ((1-p) + pe^{jω})^N.
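The closed form can be cross-checked against the direct sum over the binomial pmf. The parameter values N = 8, p = 0.3, ω = 0.7 below are arbitrary test values of mine:

```python
import cmath
from math import comb

# Verify ((1-p) + p*e^{j*w})^N equals the direct sum over the pmf.
N, p, w = 8, 0.3, 0.7

direct = sum(comb(N, k) * p**k * (1 - p)**(N - k) * cmath.exp(1j * w * k)
             for k in range(N + 1))
closed = ((1 - p) + p * cmath.exp(1j * w)) ** N

print(abs(direct - closed) < 1e-12)   # -> True
```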
Problem, page 99: Find X̄ and σ_X² for the binomial distribution.

X̄ = E[X] = Σ_{k=0}^{N} k P{X = k} = Σ_{k=0}^{N} k C(N,k) p^k (1-p)^{N-k}.

Since (a + b)^N = Σ_{k=0}^{N} C(N,k) a^k b^{N-k}, take the derivative with respect to a to get

N(a + b)^{N-1} = Σ_{k=0}^{N} C(N,k) k a^{k-1} b^{N-k},

then multiply both sides by a to get

Na(a + b)^{N-1} = Σ_{k=0}^{N} C(N,k) k a^k b^{N-k}.

Now if we set a = p, b = 1 - p, we get

Np(p + (1 - p))^{N-1} = Σ_{k=0}^{N} C(N,k) k p^k (1-p)^{N-k} = Np = X̄.

For σ_X², take the second derivative with respect to a:

N(N-1)a(a + b)^{N-2} + N(a + b)^{N-1} = Σ_{k=0}^{N} C(N,k) k² a^{k-1} b^{N-k};

after multiplying both sides by a and letting a = p, b = 1 - p, we get

N(N-1)p² + Np = E[X²] = m₂ = N²p² - Np² + Np = N²p² + Np(1 - p).

σ_X² = m₂ - m₁² = N²p² + Np(1 - p) - (Np)² = Np(1 - p).

An alternative solution is to calculate this via Φ_X(ω).
Problem: Find X̄ and σ_X² for the uniform distribution on (a,b).

f_X(x) = (1/(b - a)) (u(x - a) - u(x - b))

X̄ = ∫_a^b x (1/(b - a)) dx = (b² - a²)/(2(b - a)) = (a + b)/2

m₂ = ∫_a^b x² (1/(b - a)) dx = (b³ - a³)/(3(b - a)) = (b² + ab + a²)/3

σ_X² = m₂ - m₁² = (b² + ab + a²)/3 - (a + b)²/4 = (b² - 2ab + a²)/12 = (b - a)²/12.
Problem, page 102: Find Φ_X(ω) for the Poisson distribution.

Φ_X(ω) = E[e^{jωX}] = Σ_{k=0}^{∞} P{X = k} e^{jωk} = Σ_{k=0}^{∞} e^{-b} (b^k/k!) e^{jωk}
       = e^{-b} Σ_{k=0}^{∞} (be^{jω})^k / k!
       = e^{-b} e^{be^{jω}} = e^{-b + be^{jω}} = e^{-b(1 - e^{jω})},

noting that e^x = Σ_{k=0}^{∞} x^k/k!.
Problem, page 100: Find X̄ and σ_X² for the Poisson distribution.

X̄ = E[X] = Σ_{k=0}^{∞} k P{X = k} = Σ_{k=0}^{∞} k e^{-b} b^k/k!
  = e^{-b} (0 + b + 2b²/2! + 3b³/3! + 4b⁴/4! + ... + mb^m/m! + ...)
  = e^{-b} b (1 + b/1! + b²/2! + b³/3! + ... + b^{m-1}/(m-1)! + ...)
  = e^{-b} b e^b = b.
m₂ = E[X²] = Σ_{k=0}^{∞} k² P{X = k} = Σ_{k=0}^{∞} k² e^{-b} b^k/k!
   = e^{-b} (0 + 1²·b + 2²·b²/2! + 3²·b³/3! + 4²·b⁴/4! + ... + m²·b^m/m! + ...)
   = e^{-b} (b + 2b²/1! + 3b³/2! + 4b⁴/3! + ... + mb^m/(m-1)! + ...)
   = e^{-b} b ((1+0)b⁰/0! + (1+1)b¹/1! + (1+2)b²/2! + ... + (1+(m-1))b^{m-1}/(m-1)! + ...)
   = e^{-b} b [(1 + b/1! + b²/2! + ... + b^{m-1}/(m-1)! + ...) + (b/1·0! + 2b²/2·1! + ... + (m-1)b^{m-1}/(m-1)! + ...)]
   = e^{-b} b (e^b + be^b) = b + b².

Hence σ_X² = m₂ - m₁² = b + b² - b² = b.
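The Poisson sums above converge fast, so a truncated series reproduces X̄ = b and σ_X² = b to machine precision. A sketch with my assumed rate b = 2.5 and a generous truncation:

```python
import math

# Truncated-series check: for Poisson pmf p_k = e^{-b} b^k / k!,
# the mean and the variance m2 - m1^2 both equal b.
b = 2.5
K = 60                       # far beyond where the terms matter for b = 2.5
pk = [math.exp(-b) * b**k / math.factorial(k) for k in range(K)]

m1 = sum(k * p for k, p in enumerate(pk))
m2 = sum(k * k * p for k, p in enumerate(pk))
print(round(m1, 6), round(m2 - m1**2, 6))   # -> 2.5 2.5
```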
Gaussian

f_X(x) = (1/√(2πσ_X²)) e^{-(x-a)²/(2σ_X²)}

Φ_X(ω) = e^{jωa - ω²σ_X²/2}

dΦ_X(ω)/dω = e^{jωa - ω²σ_X²/2} (ja - ωσ_X²)

At ω = 0: m₁ = (-j) dΦ_X(ω)/dω |_{ω=0} = (-j)(j)a = a.

In the proof we need the Fourier-integral pair

f(t) = e^{-t²/(2σ²)}   ↔   F(ω) = √(2πσ²) e^{-σ²ω²/2}.
Problem: X ~ U(0,1), Y = √X. Find P{0.1 ≤ Y ≤ 0.5}.

f_Y(y) = f_X(x) / |dy/dx| = 1/(1/(2√x)) = 2√x = 2y (u(y) - u(y - 1))

P{0.1 ≤ Y ≤ 0.5} = ∫_{0.1}^{0.5} f_Y(y) dy = ∫_{0.1}^{0.5} 2y dy = y² |_{0.1}^{0.5} = 0.25 - 0.01 = 0.24.
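A quick Monte Carlo confirmation of the 0.24 answer (sample size and seed are mine); note the event {0.1 ≤ √X ≤ 0.5} is the same as {0.01 ≤ X ≤ 0.25}:

```python
import random

# Empirical check of P{0.1 <= sqrt(X) <= 0.5} = 0.24 for X ~ U(0,1).
random.seed(5)
N = 200_000
hits = sum(0.1 <= random.random() ** 0.5 <= 0.5 for _ in range(N))
print(round(hits / N, 2))   # close to the exact value 0.24
```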
Monte Carlo Integration - 1. Integrate g(x) = x between x1 = 0 and x2 = 1.

% int_mc.m
% Monte Carlo Integration
% Integrate f(x)=x between 0 and 1.
N=input('Enter # of points to use for simulation = ');
p=rand(N,2);
I=find(p(:,2) <= p(:,1));   % points below the curve y = x
area=length(I)/N
Monte Carlo Integration - 2

% int_mc1.m
% Integrate f(x)=x^2 between x1 and x2.
N=input('# of points to use for simulation = ');
p=rand(N,2);
x1=2; x2=5;
y1=x1*x1; y2=x2*x2;
% scale p(:,1) to be from x1 to x2
p(:,1)=p(:,1)*(x2-x1)+x1;
% scale p(:,2) to be from y1 to y2
p(:,2)=p(:,2)*(y2-y1)+y1;
I=find(p(:,2)<=p(:,1).^2);  % y <= x^2
ratio=length(I)/N;
area=ratio*(x2-x1)*(y2-y1)+(x2-x1)*y1
Monte Carlo Integration: Check the results

% as a check, first plot the function
f=inline('x.^2','x');
ezplot(f)
% integrate using quadl
area=quadl(@(x) f(x),x1,x2)   %=39
Solution of the Monte Carlo Integration HW Problem

After plotting the curve, we notice that it is positive for all values of x in the range. (The zero of a function can be computed via the command fzero: fzero('3*x.^2-7*x.*sin(x.^2-1)+6',[-1,2]).) From the plot we find that the range of x is [-1, 2] and the curve lies between y = 2 and y = 18.

N=1000;
X=-1+(2-(-1))*rand(N,1);        % scale X to be from -1 to 2
Y=2+(18-2)*rand(N,1);           % scale Y to be from 2 to 18
y=3*X.^2-7*X.*sin(X.^2-1)+6;    % calculated y for each X
I=find(Y < y);                  % generated (random) points below the curve
area=(length(I)/N)*(3*16)+3*2   % add the 3-by-2 rectangle below y=2
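A Python rendering of the same hit-or-miss idea (the bounding box, seed, and sample sizes are my choices). Here the box starts at y = 0, which avoids having to add a base rectangle, and a deterministic midpoint Riemann sum serves as the reference value:

```python
import math
import random

# Hit-or-miss Monte Carlo for the HW integrand on [-1, 2], checked
# against a midpoint Riemann sum.
f = lambda x: 3 * x**2 - 7 * x * math.sin(x**2 - 1) + 6

x1, x2, y1, y2 = -1.0, 2.0, 0.0, 18.0   # box containing the curve
random.seed(11)
N = 200_000
hits = sum(random.uniform(y1, y2) < f(random.uniform(x1, x2))
           for _ in range(N))
mc_area = hits / N * (x2 - x1) * (y2 - y1)

# Deterministic reference: midpoint Riemann sum
M = 200_000
riemann = sum(f(x1 + (i + 0.5) * (x2 - x1) / M)
              for i in range(M)) * (x2 - x1) / M
print(round(mc_area, 1), round(riemann, 1))   # both near 20.0
```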
Summary. Items covered were:

- Expectation was defined in general for a random variable or some function of a random variable.
- Moments (about the origin, and central) were developed as valuable measures of a random variable's characteristics. Of particular value were the mean value, variance, and skew.
- The characteristic function and moment generating function were given as convenient methods for finding moments.
- Methods were given to transform one random variable into another, and to find the distribution and density functions of the new random variable.
- The important concepts of how to generate a specified random variable by computer were developed; an example and chapter-end problems were included that are based on the use of MATLAB software.
More informationExpectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,
More informationLinear second-order differential equations with constant coefficients and nonzero right-hand side
Linear second-order differential equations with constant coefficients and nonzero right-hand side We return to the damped, driven simple harmonic oscillator d 2 y dy + 2b dt2 dt + ω2 0y = F sin ωt We note
More informationLecture 1 Measure concentration
CSE 29: Learning Theory Fall 2006 Lecture Measure concentration Lecturer: Sanjoy Dasgupta Scribe: Nakul Verma, Aaron Arvey, and Paul Ruvolo. Concentration of measure: examples We start with some examples
More informationProduct measure and Fubini s theorem
Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω
More information4 Pairs of Random Variables
B.Sc./Cert./M.Sc. Qualif. - Statistical Theory 4 Pairs of Random Variables 4.1 Introduction In this section, we consider a pair of r.v. s X, Y on (Ω, F, P), i.e. X, Y : Ω R. More precisely, we define a
More informationMath Spring Practice for the final Exam.
Math 4 - Spring 8 - Practice for the final Exam.. Let X, Y, Z be three independnet random variables uniformly distributed on [, ]. Let W := X + Y. Compute P(W t) for t. Honors: Compute the CDF function
More informationMultivariate distributions
CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More information1 Expectation of a continuously distributed random variable
OCTOBER 3, 204 LECTURE 9 EXPECTATION OF A CONTINUOUSLY DISTRIBUTED RANDOM VARIABLE, DISTRIBUTION FUNCTION AND CHANGE-OF-VARIABLE TECHNIQUES Expectation of a continuously distributed random variable Recall
More informationp. 6-1 Continuous Random Variables p. 6-2
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationBASICS OF PROBABILITY
October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,
More informationHomework 10 (due December 2, 2009)
Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n
More informationECONOMICS 207 SPRING 2006 LABORATORY EXERCISE 5 KEY. 8 = 10(5x 2) = 9(3x + 8), x 50x 20 = 27x x = 92 x = 4. 8x 2 22x + 15 = 0 (2x 3)(4x 5) = 0
ECONOMICS 07 SPRING 006 LABORATORY EXERCISE 5 KEY Problem. Solve the following equations for x. a 5x 3x + 8 = 9 0 5x 3x + 8 9 8 = 0(5x ) = 9(3x + 8), x 0 3 50x 0 = 7x + 7 3x = 9 x = 4 b 8x x + 5 = 0 8x
More informationProbability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.
Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationBasics on Probability. Jingrui He 09/11/2007
Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability
More informationCDA5530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables
CDA5530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Definition Random variable (R.V.) X: A function on sample space X: S R Cumulative distribution function
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More information1.1 Review of Probability Theory
1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,
More informationChapter 2: Random Variables
ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:
More informationECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable
ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationChapter 3 Second Order Linear Equations
Partial Differential Equations (Math 3303) A Ë@ Õæ Aë áöß @. X. @ 2015-2014 ú GA JË@ É Ë@ Chapter 3 Second Order Linear Equations Second-order partial differential equations for an known function u(x,
More informationSolution to Assignment 3
The Chinese University of Hong Kong ENGG3D: Probability and Statistics for Engineers 5-6 Term Solution to Assignment 3 Hongyang Li, Francis Due: 3:pm, March Release Date: March 8, 6 Dear students, The
More informationECE302 Spring 2015 HW10 Solutions May 3,
ECE32 Spring 25 HW Solutions May 3, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where
More informationProbability Theory Overview and Analysis of Randomized Algorithms
Probability Theory Overview and Analysis of Randomized Algorithms Analysis of Algorithms Prepared by John Reif, Ph.D. Probability Theory Topics a) Random Variables: Binomial and Geometric b) Useful Probabilistic
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationChapter 5. Random Variables (Continuous Case) 5.1 Basic definitions
Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on
More informationMidterm Examination. STA 205: Probability and Measure Theory. Wednesday, 2009 Mar 18, 2:50-4:05 pm
Midterm Examination STA 205: Probability and Measure Theory Wednesday, 2009 Mar 18, 2:50-4:05 pm This is a closed-book examination. You may use a single sheet of prepared notes, if you wish, but you may
More informationLet X and Y denote two random variables. The joint distribution of these random
EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.
More informationAverages of Random Variables. Expected Value. Die Tossing Example. Averages of Random Variables
Averages of Random Variables Suppose that a random variable U can tae on any one of L random values, say u 1, u 2,...u L. Imagine that we mae n independent observations of U and that the value u is observed
More informationBivariate distributions
Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient
More informationLecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf)
Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution
More informationJoint p.d.f. and Independent Random Variables
1 Joint p.d.f. and Independent Random Variables Let X and Y be two discrete r.v. s and let R be the corresponding space of X and Y. The joint p.d.f. of X = x and Y = y, denoted by f(x, y) = P(X = x, Y
More informationChapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results
More informationLecture 5: Moment generating functions
Lecture 5: Moment generating functions Definition 2.3.6. The moment generating function (mgf) of a random variable X is { x e tx f M X (t) = E(e tx X (x) if X has a pmf ) = etx f X (x)dx if X has a pdf
More information