Lecture 5: Expectation


1. Expectations for random variables
   1.1 Expectations for simple random variables
   1.2 Expectations for bounded random variables
   1.3 Expectations for general random variables
   1.4 Expectation as a Lebesgue integral
   1.5 Riemann and Riemann-Stieltjes integrals
2. Expectation and distribution of random variables
   2.1 Expectation for transformed discrete random variables
   2.2 Expectation for transformed continuous random variables
   2.3 Expectation for a product of independent random variables
   2.4 Moments of higher order

1. Expectations for random variables

1.1 Expectations for simple random variables

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable.

P = <A_1, ..., A_n> is a partition of the sample space Ω, i.e., a family of sets such that (a) A_1, ..., A_n ∈ F; (b) A_1 ∪ ··· ∪ A_n = Ω; (c) A_i ∩ A_j = ∅ for i ≠ j.

Definition 5.1. A random variable X is called a simple random variable if there exist a partition P = {A_1, ..., A_n} of the sample space Ω and real numbers x_1, ..., x_n such that

   X(ω) = Σ_{i=1}^n x_i I_{A_i}(ω) = x_i  if ω ∈ A_i,  i = 1, ..., n.

Definition 5.2. If X is a simple random variable, then its expectation (expected value) is defined as

   EX = Σ_{i=1}^n x_i P(A_i).

Notations that are often used: EX = E[X] = E(X).

Examples

(1) Let X = X(ω) = M, ω ∈ Ω, where M is a constant. Since <Ω> is a partition, EX = M·P(Ω) = M.

(2) Let I_A = I_A(ω) be the indicator of a random event A, i.e., the random variable that takes the values 1 and 0 on the sets A and A̅, respectively. Since <A, A̅> is a partition, E I_A = 1·P(A) + 0·P(A̅) = P(A).

The expectation EX of a simple random variable always exists (takes a finite value) and possesses the following properties:

(1) If X = Σ_{i=1}^n x_i I_{A_i} and Y = Σ_{j=1}^m y_j I_{B_j} are two simple random variables and a and b are any real numbers, then Z = aX + bY is also a simple random variable and EZ = aEX + bEY.

(a) If {A_1, ..., A_n} and {B_1, ..., B_m} are two partitions of Ω, then {A_i ∩ B_j, i = 1, ..., n, j = 1, ..., m} is also a partition of Ω;

(b) Z = aX + bY = Σ_{i=1}^n Σ_{j=1}^m (a x_i + b y_j) I_{A_i ∩ B_j};

(c) EZ = Σ_{i=1}^n Σ_{j=1}^m (a x_i + b y_j) P(A_i ∩ B_j)
       = Σ_{i=1}^n Σ_{j=1}^m a x_i P(A_i ∩ B_j) + Σ_{i=1}^n Σ_{j=1}^m b y_j P(A_i ∩ B_j)
       = a Σ_{i=1}^n x_i Σ_{j=1}^m P(A_i ∩ B_j) + b Σ_{j=1}^m y_j Σ_{i=1}^n P(A_i ∩ B_j)
       = a Σ_{i=1}^n x_i P(A_i) + b Σ_{j=1}^m y_j P(B_j) = aEX + bEY.

(2) If X = Σ_{i=1}^n x_i I_{A_i} is a simple random variable such that P(X ≥ 0) = 1, then EX ≥ 0.

Indeed, P(X ≥ 0) = 1 implies that P(A_i) = 0 whenever x_i < 0. In this case EX = Σ_{i: x_i ≥ 0} x_i P(A_i) ≥ 0.

(2′) If X and Y are two simple random variables such that P(X ≤ Y) = 1, then EX ≤ EY.

Indeed, P(X ≤ Y) = 1 ⟹ P(Y − X ≥ 0) = 1 ⟹ E(Y − X) = EY − EX ≥ 0.
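The definition EX = Σ_i x_i P(A_i) and the linearity property (1) can be checked numerically on a common refinement {A_i ∩ B_j}. The sketch below is our own illustration (the helper name `expectation` and the numbers are not from the notes):

```python
# Expectation of a simple random variable and a check of linearity,
# E[aX + bY] = aEX + bEY, computed over the joint partition {A_i ∩ B_j}.
def expectation(values, probs):
    """EX = sum_i x_i * P(A_i) for a simple random variable."""
    return sum(x * p for x, p in zip(values, probs))

# Joint probabilities p[i][j] = P(A_i ∩ B_j) (X and Y need not be independent).
p = [[0.1, 0.3], [0.2, 0.4]]
x = [1.0, 2.0]    # values of X on A_1, A_2
y = [10.0, 20.0]  # values of Y on B_1, B_2
a, b = 2.0, -1.0

EX = expectation(x, [sum(row) for row in p])        # P(A_i) = sum_j P(A_i ∩ B_j)
EY = expectation(y, [sum(col) for col in zip(*p)])  # P(B_j) = sum_i P(A_i ∩ B_j)
EZ = sum((a * x[i] + b * y[j]) * p[i][j] for i in range(2) for j in range(2))
print(abs(EZ - (a * EX + b * EY)) < 1e-12)  # True
```

The marginal probabilities P(A_i) and P(B_j) are recovered from the refinement exactly as in step (c) of the proof above.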

1.2 Expectations for bounded random variables

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable.

Definition. A random variable X is bounded if there exists a constant M such that |X(ω)| ≤ M for every ω ∈ Ω.

Examples

(1) If Ω = {ω_1, ..., ω_N} is a finite sample space, then any random variable X = X(ω) is bounded.

(2) If Ω = {ω = (ω_1, ω_2, ...), ω_i = 0, 1, i = 1, 2, ...} is the sample space for an infinite series of Bernoulli trials, then the random variable X = X(ω) = min(n ≥ 1 : ω_n = 1) (the number of the first successful trial) is an unbounded random variable, while the random variable Y = Y(ω) = ω_1 + ··· + ω_n (the number of successes in the first n trials) is a bounded random variable.

Definition. If X is a bounded random variable, then its expectation is defined as

   EX = sup_{X′ ≤ X} EX′ = inf_{X″ ≥ X} EX″,

where the supremum is taken over simple random variables X′ ≤ X, while the infimum is taken over simple random variables X″ ≥ X.

To be sure that the definition is meaningful, we should prove that the sup and inf in the above definition are equal.

(a) The inequality sup_{X′ ≤ X} EX′ ≤ inf_{X″ ≥ X} EX″ holds because any two simple random variables X′ ≤ X and X″ ≥ X are connected by the relation X′ ≤ X″ and therefore EX′ ≤ EX″.

(b) Let |X(ω)| < M. Fix a number n and define

   A_i = {ω ∈ Ω : (i−1)M/n < X(ω) ≤ iM/n},  −n ≤ i ≤ n.

Note that A_i ∈ F, i = −n, ..., n. Define the simple random variables

   X′_n = Σ_{i=−n}^n ((i−1)M/n) I_{A_i},   X″_n = Σ_{i=−n}^n (iM/n) I_{A_i}.

By the definition, X′_n ≤ X ≤ X″_n. Moreover, X″_n − X′_n = M/n and, therefore, EX″_n − EX′_n = M/n. Thus,

   inf_{X″ ≥ X} EX″ ≤ EX″_n = EX′_n + M/n ≤ sup_{X′ ≤ X} EX′ + M/n.

Since n is arbitrary, the relation above implies that inf_{X″ ≥ X} EX″ ≤ sup_{X′ ≤ X} EX′.

(c) By (a) and (b) we get sup_{X′ ≤ X} EX′ = inf_{X″ ≥ X} EX″.

The expectation EX of a bounded random variable always exists (takes a finite value) and possesses properties similar to those of the expectation of a simple random variable:
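The sandwich construction of step (b) can be made concrete. The example below is our own (not from the notes): X(ω) = ω² on Ω = [0, 1] with the uniform measure and bound M = 1; since X ≥ 0 only the indices i = 1, ..., n contribute, and P(A_i) = √(iM/n) − √((i−1)M/n) because A_i = {ω : (i−1)M/n < ω² ≤ iM/n}.

```python
# Lower and upper simple approximations of X(ω) = ω² on [0, 1]:
# their expectations bracket EX = 1/3 and differ by exactly M/n.
import math

def sandwich(n, M=1.0):
    lower = upper = 0.0
    for i in range(1, n + 1):
        p = math.sqrt(i * M / n) - math.sqrt((i - 1) * M / n)  # P(A_i)
        lower += (i - 1) * M / n * p   # E of the lower simple variable X'_n
        upper += i * M / n * p         # E of the upper simple variable X''_n
    return lower, upper

lo, up = sandwich(1000)
print(lo, up, up - lo)  # both near 1/3, gap exactly 1/1000
```

Increasing n tightens the bracket at rate M/n, which is precisely why the sup and inf in the definition coincide.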

(1) If X and Y are two bounded random variables and a and b are any real numbers, then Z = aX + bY is also a bounded random variable and EZ = aEX + bEY.

(a) Let us first prove that EaX = aEX. The case a = 0 is trivial. The case a < 0 is reduced to the case a > 0 by considering the random variable −X. If a > 0, then

   EaX = sup_{X′ ≤ aX} EX′ = sup_{X′ ≤ X} EaX′ = a sup_{X′ ≤ X} EX′ = aEX.

(b) The proof of (1) can thus be reduced to the case a = b = 1 by considering the random variables aX and bY. We have

   sup_{Z′ ≤ Z = X + Y} EZ′ ≥ sup_{X′ ≤ X, Y′ ≤ Y} E(X′ + Y′),

since X′ ≤ X and Y′ ≤ Y implies Z′ = X′ + Y′ ≤ Z = X + Y, and thus the supremum on the right-hand side of the above inequality is actually taken over a smaller set.

(c) Using (b) we get EZ = E(X + Y) ≥ EX + EY. Indeed,

   EZ = E(X + Y) = sup_{Z′ ≤ Z = X + Y} EZ′ ≥ sup_{X′ ≤ X, Y′ ≤ Y} E(X′ + Y′)
      = sup_{X′ ≤ X, Y′ ≤ Y} (EX′ + EY′) = sup_{X′ ≤ X} EX′ + sup_{Y′ ≤ Y} EY′ = EX + EY.

(d) The reverse inequality follows by considering the random variables −X and −Y.

(2) If X is a bounded random variable such that P(X ≥ 0) = 1, then EX ≥ 0.

Denote A = {ω : X(ω) ≥ 0}, and let M be a constant that bounds |X|. Then X(ω) ≥ X_0(ω), ω ∈ Ω, where X_0 = 0·I_A(ω) + (−M)·I_{A̅}(ω) = −M·I_{A̅}(ω) is a simple random variable. Then

   EX = sup_{X′ ≤ X} EX′ ≥ EX_0 = −M·P(A̅) = 0.

(2′) If X and Y are two bounded random variables such that P(X ≤ Y) = 1, then EX ≤ EY.

1.3 Expectations for general random variables

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable.

Definition. If X = X(ω) ≥ 0, ω ∈ Ω, i.e., X is a non-negative random variable, then

   EX = sup_{0 ≤ X′ ≤ X} EX′,

where the supremum is taken over all bounded random variables X′ such that 0 ≤ X′ ≤ X.

The expectation EX of a non-negative random variable can take a finite non-negative value or be equal to infinity.

Any random variable X can be decomposed into the difference of two non-negative random variables X⁺ = max(X, 0) and X⁻ = max(−X, 0), that is, X = X⁺ − X⁻.

Definition. If X is integrable, i.e., E|X| < ∞, then its expectation is defined as

   EX = EX⁺ − EX⁻.

The definition is correct since 0 ≤ X⁺, X⁻ ≤ |X| and, since X is integrable, 0 ≤ EX⁺, EX⁻ < ∞.

The expectation EX of a random variable possesses properties similar to those of the expectation of simple and bounded random variables:

(1) If X and Y are two integrable random variables and a and b are any real numbers, then Z = aX + bY is also an integrable random variable and EZ = aEX + bEY.

(a) Let us first prove that EaX = aEX for the case where a ≥ 0 and X ≥ 0, where one should count the product aEX = 0 if a = 0, EX = ∞, and aEX = ∞ if a > 0, EX = ∞. The case a = 0 is trivial since in this case aX ≡ 0 and therefore EaX = 0. If a > 0, then

   EaX = sup_{0 ≤ X′ ≤ aX} EX′ = sup_{0 ≤ X′ ≤ X} EaX′ = a sup_{0 ≤ X′ ≤ X} EX′ = aEX.

(b) Let us now prove that EaX = aEX for an integrable random variable X. Here the case a ≤ 0 can be reduced to the case a ≥ 0 by considering the random variable −X. If a > 0, then

   EaX = E(aX)⁺ − E(aX)⁻ = aEX⁺ − aEX⁻ = aEX.

(c) The proof of (1) for X, Y ≥ 0 can thus be reduced to the case a = b = 1 by considering the random variables aX and bY. We have

   sup_{0 ≤ Z′ ≤ Z = X + Y} EZ′ ≥ sup_{0 ≤ X′ ≤ X, 0 ≤ Y′ ≤ Y} E(X′ + Y′),

since X′ ≤ X and Y′ ≤ Y implies Z′ = X′ + Y′ ≤ Z = X + Y, and thus the supremum on the right-hand side of the above inequality is actually taken over a smaller set.

(d) Using (c) we get EZ = E(X + Y) ≥ EX + EY for X, Y ≥ 0. Indeed,

   EZ = E(X + Y) = sup_{0 ≤ Z′ ≤ Z = X + Y} EZ′ ≥ sup_{0 ≤ X′ ≤ X, 0 ≤ Y′ ≤ Y} E(X′ + Y′)
      = sup_{0 ≤ X′ ≤ X, 0 ≤ Y′ ≤ Y} (EX′ + EY′) = sup_{0 ≤ X′ ≤ X} EX′ + sup_{0 ≤ Y′ ≤ Y} EY′ = EX + EY.

(e) To prove EZ = E(X + Y) ≤ EX + EY for X, Y ≥ 0, let us use the inequality for non-negative bounded random variables

   min(X + Y, n) ≤ min(X, n) + min(Y, n).

This implies E min(X + Y, n) ≤ E min(X, n) + E min(Y, n), and, in turn,

   EZ = E(X + Y) = sup_{0 ≤ Z′ ≤ X + Y} EZ′ = sup_{n ≥ 1} E min(X + Y, n)
      ≤ sup_{n ≥ 1} (E min(X, n) + E min(Y, n))
      ≤ sup_{n ≥ 1} E min(X, n) + sup_{n ≥ 1} E min(Y, n) = EX + EY.

(f) Finally, to prove EZ = E(X + Y) = EX + EY for arbitrary integrable random variables X and Y, let us define the random variable Z with positive part Z⁺ = X⁺ + Y⁺ and negative part Z⁻ = X⁻ + Y⁻. We have

   E(X + Y) = E(X⁺ − X⁻ + Y⁺ − Y⁻) = E(Z⁺ − Z⁻) = EZ⁺ − EZ⁻
            = (EX⁺ + EY⁺) − (EX⁻ + EY⁻) = EX + EY.

(2) If X is a random variable such that P(X ≥ 0) = 1, then EX ≥ 0.

Since X_0 ≡ 0 is a non-negative bounded random variable, EX = sup_{0 ≤ X′ ≤ X} EX′ ≥ EX_0 = 0.

(2′) If X and Y are two random variables such that P(X ≤ Y) = 1, then EX ≤ EY.

1.4 Expectation as a Lebesgue integral

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable defined on the probability space <Ω, F, P>.

In fact, EX, as defined above, is the Lebesgue integral of the real-valued function X = X(ω) with respect to the measure P(A), and, therefore, according to the notations used in integration theory,

   EX = ∫_Ω X(ω) P(dω).

The following notations are also used: EX = ∫_Ω X(ω) P(dω) = ∫_Ω X dP = ∫ X dP.

Definition 5.3. A finite measure Q(A) defined on a σ-algebra F is a function that can be represented as Q(A) = q·P(A), where P(A) is a probability measure defined on F and q > 0 is a positive constant.

Definition 5.4. The Lebesgue integral ∫_Ω X dQ is defined by the formula ∫_Ω X dQ = q ∫_Ω X dP.

Examples

(1) The Lebesgue measure m(A) on the Borel σ-algebra of an interval [c, d] is uniquely determined by its values on intervals, m((a, b]) = b − a, c ≤ a ≤ b ≤ d. It can be represented in the form m(A) = q·P(A), where q = d − c and P(A) is a probability measure on the Borel σ-algebra of the interval [c, d], uniquely determined by its values on intervals, P((a, b]) = (b − a)/(d − c), c ≤ a ≤ b ≤ d.

(2) According to the above definition,

   ∫_{[c,d]} X dm = ∫_c^d X dm = q ∫_{[c,d]} X dP = q·EX,

where X should be considered as a random variable defined on the probability space <Ω = [c, d], F = B([c, d]), P(A)>.

Definition. A σ-finite measure Q(A) defined on a σ-algebra F is a function of sets for which there exists a sequence of sets Ω_n ∈ F, Ω_n ⊆ Ω_{n+1}, n = 1, 2, ..., with ∪_n Ω_n = Ω, such that Q(Ω_n) < ∞, n = 1, 2, ..., and Q(A) = lim_n Q(A ∩ Ω_n).

Definition. The Lebesgue integral ∫_Ω X dQ is defined for a random variable X = X(ω) and a σ-finite measure Q, under the condition that ∫_{Ω_n} |X| dQ < ∞, n = 1, 2, ..., and lim_n ∫_{Ω_n} |X| dQ < ∞, by the formula

   ∫_Ω X dQ = lim_n ∫_{Ω_n} X dQ.

Examples

(1) The Lebesgue measure m(A) on the Borel σ-algebra of the real line R¹ is uniquely determined by its values on intervals, m((a, b]) = b − a, a ≤ b. It can be represented in the form m(A) = lim_n m(A ∩ [−n, n]), where m(A ∩ [−n, n]) is the Lebesgue measure on the interval [−n, n] for every n.

(2) According to the above definition,

   ∫_{R¹} X dm = ∫_{−∞}^{∞} X dm = lim_n ∫_{−n}^{n} X dm,

under the condition that ∫_{−n}^{n} |X| dm < ∞, n = 1, 2, ..., and lim_n ∫_{−n}^{n} |X| dm < ∞.
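The identity ∫_{[c,d]} X dm = q·EX from Example (2) can be sketched numerically. The illustration below is our own (the function X(x) = x² on [0, 3] is not from the notes): q times a Monte Carlo estimate of EX under the uniform measure should match ∫₀³ x² dx = 9.

```python
# ∫_{[c,d]} X dm = (d − c) · EX, where X is a random variable on the
# probability space <[c, d], B([c, d]), P> with the uniform measure
# P((a, b]) = (b − a)/(d − c).  Checked for X(x) = x² on [0, 3].
import random

random.seed(0)
c, d = 0.0, 3.0
q = d - c
n = 200_000
EX = sum(random.uniform(c, d) ** 2 for _ in range(n)) / n  # EX under P
print(q * EX)  # ≈ ∫_0^3 x² dx = 9, up to Monte Carlo error
```

The rescaling by q = d − c is exactly Definition 5.4 with Q = m.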

1.5 Riemann and Riemann-Stieltjes integrals

f(x) is a real-valued function defined on the real line; [a, b]; a = x_{n,0} < x_{n,1} < ··· < x_{n,n} = b; d(n) = max_{1 ≤ k ≤ n} (x_{n,k} − x_{n,k−1}) → 0 as n → ∞; x′_{n,k} ∈ [x_{n,k−1}, x_{n,k}], k = 1, ..., n, n = 1, 2, ...;

   S_n = Σ_{k=1}^n f(x′_{n,k})(x_{n,k} − x_{n,k−1}).

Definition 5.5. The Riemann integral ∫_a^b f(x) dx exists if and only if the limit lim_n S_n exists and is the same for any choice of partitions with d(n) → 0 as n → ∞ and any choice of points x′_{n,k}. In this case ∫_a^b f(x) dx = lim_n S_n.

Definition 5.6. If the function f is bounded and Riemann integrable on every finite interval, and lim_n ∫_{−n}^{n} |f(x)| dx < ∞, then the function f is Riemann integrable on the real line and

   ∫_{−∞}^{∞} f(x) dx = lim_n ∫_{−n}^{n} f(x) dx.

Theorem 5.1*. A real-valued bounded Borel function f(x) defined on the real line is Riemann integrable on [a, b] if and only if its set of discontinuity points R_f ∩ [a, b] has Lebesgue measure m(R_f ∩ [a, b]) = 0.

Theorem 5.2*. Let Ω = R¹ and F = B¹, and let f = f(x) be a Riemann integrable function, i.e., ∫ |f(x)| dx < ∞. Then the Lebesgue integral ∫ |f(x)| m(dx) < ∞ and

   ∫ f(x) dx = ∫ f(x) m(dx).

Example

Let D be the set of all irrational points in the interval [a, b]. The function I_D(x), a ≤ x ≤ b, is a bounded Borel function which is discontinuous at all points of the interval [a, b]. It is not Riemann integrable. But it is Lebesgue integrable, since it is a simple function, and

   ∫_{[a,b]} I_D(x) m(dx) = 0·m([a, b] \ D) + 1·m(D) = 0 + (b − a) = b − a.

f(x) is a real-valued function defined on the real line; G(x) is a real-valued non-decreasing, right-continuous function defined on the real line; G(A) is the measure uniquely determined by the function G(x) via the relations G((a, b]) = G(b) − G(a), −∞ < a ≤ b < ∞; [a, b]; a = x_{n,0} < x_{n,1} < ··· < x_{n,n} = b; d(n) = max_{1 ≤ k ≤ n} (x_{n,k} − x_{n,k−1}) → 0 as n → ∞; x′_{n,k} ∈ [x_{n,k−1}, x_{n,k}], k = 1, ..., n, n = 1, 2, ...;

   S_n = Σ_{k=1}^n f(x′_{n,k})(G(x_{n,k}) − G(x_{n,k−1})).

Definition 5.7. The Riemann-Stieltjes integral ∫_a^b f(x) dG(x) exists if and only if the limit lim_n S_n exists and is the same for any choice of partitions with d(n) → 0 as n → ∞ and any choice of points x′_{n,k}. In this case ∫_a^b f(x) dG(x) = lim_n S_n.

Definition 5.8. If the function f is bounded and Riemann-Stieltjes integrable on every finite interval, and lim_n ∫_{−n}^{n} |f(x)| dG(x) < ∞, then the function f is Riemann-Stieltjes integrable on the real line and

   ∫_{−∞}^{∞} f(x) dG(x) = lim_n ∫_{−n}^{n} f(x) dG(x).

Theorem 5.3*. A real-valued bounded Borel function f(x) defined on the real line is Riemann-Stieltjes integrable on [a, b] if and only if its set of discontinuity points R_f ∩ [a, b] has measure G(R_f ∩ [a, b]) = 0.

Theorem 5.4*. Let Ω = R¹ and F = B¹, and let f = f(x) be a Riemann-Stieltjes integrable function, i.e., ∫ |f(x)| dG(x) < ∞. Then the Lebesgue integral ∫ |f(x)| G(dx) < ∞ and

   ∫ f(x) dG(x) = ∫ f(x) G(dx).

2. Expectation and distribution of random variables

2.1 Expectation for transformed discrete random variables

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable defined on the probability space <Ω, F, P>; g(x) is a Borel real-valued function defined on the real line; Y = g(X) is a transformed random variable.

Definition 5.9. A random variable X is a discrete random variable if there exists a finite or countable set of real numbers {x_n} such that Σ_n p_X(x_n) = 1, where p_X(x_n) = P(X = x_n).

Theorem 5.5**. Let X be a discrete random variable. Then

   EY = Eg(X) = ∫_Ω g(X(ω)) P(dω) = Σ_n g(x_n) p_X(x_n).

Examples

(1) Let Ω = {ω_1, ..., ω_N} be a discrete sample space, F = F_0 the σ-algebra of all subsets of Ω, and P(A) a probability measure given by the formula P(A) = Σ_{ω_i ∈ A} p(ω_i), where p(ω_i) = P(A_i) ≥ 0, i = 1, ..., N, are the probabilities of the one-point events A_i = {ω_i}, satisfying the relation Σ_{ω_i ∈ Ω} p(ω_i) = 1. A random variable X = X(ω) and the transformed random variable Y = g(X) are, in this case, simple random variables, since <A_1, ..., A_N> is a partition of Ω and X = Σ_{ω_i ∈ Ω} X(ω_i) I_{A_i} and Y = Σ_{ω_i ∈ Ω} g(X(ω_i)) I_{A_i}. In this case,

   p_X(x_j) = P(X = x_j) = Σ_{ω_i : X(ω_i) = x_j} p(ω_i),

and, according to the definition of expectation and Theorem 5.5,

   EY = Eg(X) = Σ_{ω_i ∈ Ω} g(X(ω_i)) p(ω_i) = Σ_n g(x_n) p_X(x_n).

(2) Let Ω = {ω = (ω_1, ..., ω_n), ω_i = 0, 1, i = 1, ..., n} be the discrete sample space for a series of n Bernoulli trials. In this case p(ω) = Π_{i=1}^n p^{ω_i} q^{1−ω_i}, where p, q > 0, p + q = 1.

Let X(ω) = ω_1 + ··· + ω_n be the number of successes in n trials. In this case x_j = j, j = 0, ..., n, and

   p_X(j) = P(X = j) = Σ_{ω : X(ω) = j} p^j q^{n−j} = C_n^j p^j q^{n−j},  j = 0, ..., n,

where C_n^j = n!/(j!(n−j)!), and, according to the definition of expectation and Theorem 5.5,

   EX = Σ_{ω ∈ Ω} X(ω) p(ω) = Σ_{j=0}^n j p_X(j) = np.

(3) Let X be a Poisson random variable, i.e., p_X(n) = e^{−λ} λ^n / n!, n = 0, 1, .... Then

   EX = Σ_{n=0}^∞ n e^{−λ} λ^n / n! = λ.

2.2 Expectation for transformed continuous random variables

<Ω, F, P> is a probability space; X = X(ω) is a real-valued random variable defined on the probability space <Ω, F, P>; P_X(A) = P(X ∈ A) and F_X(x) = P(X ≤ x) are, respectively, the distribution and the distribution function of the random variable X; g(x) is a Borel real-valued function defined on the real line; Y = g(X) is a transformed random variable.
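Before turning to the continuous case, Theorem 5.5 and the discrete Examples (2) and (3) above can be verified numerically. The sketch below is our own (the helper `e_g` is not from the notes; the Poisson sum is truncated at a large index, which is an approximation):

```python
# E g(X) = Σ_n g(x_n) p_X(x_n) for a discrete X, checked against the
# binomial mean np and the Poisson mean λ.
from math import comb, exp, factorial

def e_g(pmf, g):
    """Theorem 5.5: sum g(x) * p_X(x) over the support of X."""
    return sum(g(x) * px for x, px in pmf.items())

n, p = 10, 0.3
binom = {j: comb(n, j) * p**j * (1 - p)**(n - j) for j in range(n + 1)}
print(e_g(binom, lambda x: x))  # = np = 3.0, up to rounding

lam = 2.5
poisson = {k: exp(-lam) * lam**k / factorial(k) for k in range(100)}  # truncated
print(e_g(poisson, lambda x: x))  # ≈ λ = 2.5
```

The same `e_g` with other choices of g gives higher moments, anticipating Section 2.4.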

Theorem 5.6**. Let X be a random variable. Then

   EY = Eg(X) = ∫_Ω g(X(ω)) P(dω) = ∫ g(x) P_X(dx).

Definition 5.10. A random variable X is a continuous random variable if there exists a non-negative Borel function f_X(x) defined on the real line, with ∫ f_X(x) m(dx) = 1, such that

   F_X(x) = ∫_{−∞}^x f_X(y) m(dy),  x ∈ R¹.

The function f_X(x) is called the probability density of the random variable X (or of the distribution function F_X(x)).

According to Theorem 5.2, if f_X(x) is a Riemann integrable function, i.e., ∫ f_X(x) dx = 1, then F_X(x) = ∫_{−∞}^x f_X(y) m(dy) = ∫_{−∞}^x f_X(y) dy, x ∈ R¹.

Theorem 5.7**. Let X be a continuous random variable with probability density f_X. Then

   EY = Eg(X) = ∫_Ω g(X(ω)) P(dω) = ∫ g(x) P_X(dx) = ∫ g(x) f_X(x) m(dx).

According to Theorem 5.2, if g(x) f_X(x) is a Riemann integrable function, i.e., ∫ |g(x) f_X(x)| dx < ∞, then

   EY = Eg(X) = ∫_Ω g(X(ω)) P(dω) = ∫ g(x) P_X(dx) = ∫ g(x) f_X(x) m(dx) = ∫ g(x) f_X(x) dx.

Examples

(1) Let Ω = [0, T] × [0, T], F = B(Ω), and let m(A) be the Lebesgue measure on B(Ω), which is uniquely determined by its values on rectangles, m([a, b] × [c, d]) = (b − a)(d − c) (m(A) is the area of a Borel set A). Let the corresponding probability measure be P(A) = m(A)/T². Let the random variable be X(ω) = ω_1 ∧ ω_2 = min(ω_1, ω_2), ω = (ω_1, ω_2) ∈ Ω. Find EX.

(1′) Directly:

   EX = (1/T²) ∫_Ω (ω_1 ∧ ω_2) m(dω) = (1/T²) ∫∫_{[0,T]×[0,T]} (ω_1 ∧ ω_2) dω_1 dω_2 = ?

(1″) Via the distribution: the distribution function is

   F_X(x) = P(X ≤ x) = (T² − (T − x)²)/T² = 1 − (1 − x/T)²,  0 ≤ x ≤ T.

It has a continuous (and, therefore, Riemann integrable) probability density f_X(x) = (2/T)(1 − x/T), 0 ≤ x ≤ T. Thus,

   EX = ∫_0^T x (2/T)(1 − x/T) dx = T/3.

(2) Let X = X(ω) be a random variable defined on a probability space <Ω, F, P> with the distribution function F(x) = P(X ≤ x) and the distribution F(A) = P(X ∈ A). Then

   EX = ∫_Ω X(ω) P(dω) = ∫_{R¹} x F(dx) = ∫_{R¹} x dF(x).

(3) Let X be a non-negative random variable. Then the above formula can be transformed to the following form:

   EX = ∫_{[0,∞)} x F(dx) = ∫_0^∞ x dF(x) = ∫_0^∞ (1 − F(x)) dx.
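The density computation in (1″) can be checked numerically with a midpoint Riemann sum over [0, T] (standard library only; the concrete value T = 2 is our own choice):

```python
# ∫_0^T x f_X(x) dx with f_X(x) = (2/T)(1 − x/T) should give EX = T/3.
T = 2.0
N = 100_000
h = T / N
# Midpoint rule for the integrand x * f_X(x).
EX = sum((k + 0.5) * h * (2 / T) * (1 - (k + 0.5) * h / T) * h for k in range(N))
print(EX, T / 3)  # both ≈ 0.6667
```

Since the integrand is a polynomial, the midpoint rule converges quickly and the two printed values agree to many digits.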

(a) ∫_0^∞ x dF(x) = lim_{A→∞} ∫_0^A x dF(x);

(b) ∫_0^∞ (1 − F(x)) dx = lim_{A→∞} ∫_0^A (1 − F(x)) dx;

(c) ∫_0^A x dF(x) = −A(1 − F(A)) + ∫_0^A (1 − F(x)) dx (integration by parts);

(d) if ∫_0^∞ (1 − F(x)) dx < ∞, then by (c) ∫_0^A x dF(x) ≤ ∫_0^A (1 − F(x)) dx, so ∫_0^∞ x dF(x) < ∞;

(e) A(1 − F(A)) ≤ ∫_A^∞ x dF(x), which tends to 0 as A → ∞ if ∫_0^∞ x dF(x) < ∞;

(f) hence ∫_0^∞ x dF(x) < ∞ if and only if ∫_0^∞ (1 − F(x)) dx < ∞;

(g) letting A → ∞ in (c) and using (a), (b), and (e), we get ∫_0^∞ x dF(x) = ∫_0^∞ (1 − F(x)) dx.

2.3 Expectation for a product of independent random variables

Theorem 5.8. If X and Y are two independent random variables and E|X|, E|Y| < ∞, then E|XY| < ∞ and EXY = EX·EY.

(a) Let X = Σ_{i=1}^n x_i I_{A_i} and Y = Σ_{j=1}^m y_j I_{B_j} be two simple independent random variables. Then XY = Σ_{i=1}^n Σ_{j=1}^m x_i y_j I_{A_i ∩ B_j} is also a simple random variable and, therefore,

   EXY = Σ_{i=1}^n Σ_{j=1}^m x_i y_j P(A_i ∩ B_j) = Σ_{i=1}^n Σ_{j=1}^m x_i y_j P(A_i) P(B_j)
       = Σ_{i=1}^n x_i P(A_i) Σ_{j=1}^m y_j P(B_j) = EX·EY.

(b) The proof for bounded and general random variables is analogous to the proofs given for the linearity property of expectations.

2.4 Moments of higher order

Let X = X(ω) be a random variable defined on a probability space <Ω, F, P> with the distribution function F_X(x) = P(X ≤ x) and the distribution F_X(A) = P(X ∈ A). Let also Y = X^n, with the distribution function F_Y(y) = P(X^n ≤ y) and the distribution F_Y(A) = P(X^n ∈ A).

Definition 5.11. The moment of order n of the random variable X is the expectation of the random variable Y = X^n:

   EX^n = ∫_Ω X(ω)^n P(dω) = ∫_{R¹} y F_Y(dy) = ∫_{R¹} y dF_Y(y) = ∫_{R¹} x^n F_X(dx) = ∫_{R¹} x^n dF_X(x).

Problems

1. Let X be a discrete random variable taking non-negative integer values 0, 1, 2, .... Prove that EX = Σ_{n=1}^∞ P(X ≥ n).

2. Let X be a non-negative random variable with F(x) = P(X ≤ x). Prove that EX^n = n ∫_0^∞ x^{n−1} (1 − F(x)) dx.

3. Let X be a geometric random variable taking values n = 0, 1, ... with probabilities P(X = n) = q p^n, n = 0, 1, .... Please find: (a) P(X ≥ n); (b) EX.

4. The random variable X has a Poisson distribution with parameter λ > 0. Please find E[1/(1 + X)].

5. Let X_1, ..., X_n be independent random variables uniformly distributed in the interval [0, T], and Z_n = max(X_1, ..., X_n). Please find: (a) P(Z_n ≤ x); (b) EZ_n; (c) E(Z_n − T)².

6. Let X_1, ..., X_n be independent random variables uniformly distributed in the interval [0, T], and Y_n = 2(X_1 + ··· + X_n)/n. Please find: (a) EY_n; (b) E(Y_n − T)².

7. Let Var X = E(X − EX)² < ∞. Please prove that (a) Var X = EX² − (EX)²; (b) Var X = inf_{a ∈ R¹} E(X − a)².

8. Let X and Y be independent random variables with Var X, Var Y < ∞. Please prove that Var(X + Y) = Var X + Var Y.

9. Let X ≥ 0 be a continuous non-negative random variable with EX² < ∞. Please prove that EX² = ∫_0^∞ x² f_X(x) dx = 2 ∫_0^∞ x(1 − F_X(x)) dx.

10. Let a random variable X have an exponential distribution, F_X(x) = I(x ≥ 0)(1 − e^{−λx}). Please find EX and Var X = E(X − EX)².


More information

Measure-theoretic probability

Measure-theoretic probability Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

Review of Probability Theory II

Review of Probability Theory II Review of Probability Theory II January 9-3, 008 Exectation If the samle sace Ω = {ω, ω,...} is countable and g is a real-valued function, then we define the exected value or the exectation of a function

More information

Probability. Paul Schrimpf. January 23, UBC Economics 326. Probability. Paul Schrimpf. Definitions. Properties. Random variables.

Probability. Paul Schrimpf. January 23, UBC Economics 326. Probability. Paul Schrimpf. Definitions. Properties. Random variables. Probability UBC Economics 326 January 23, 2018 1 2 3 Wooldridge (2013) appendix B Stock and Watson (2009) chapter 2 Linton (2017) chapters 1-5 Abbring (2001) sections 2.1-2.3 Diez, Barr, and Cetinkaya-Rundel

More information

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable

More information

4 Expectation & the Lebesgue Theorems

4 Expectation & the Lebesgue Theorems STA 205: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on a probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω, does

More information

A List of Problems in Real Analysis

A List of Problems in Real Analysis A List of Problems in Real Analysis W.Yessen & T.Ma December 3, 218 This document was first created by Will Yessen, who was a graduate student at UCI. Timmy Ma, who was also a graduate student at UCI,

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

I. ANALYSIS; PROBABILITY

I. ANALYSIS; PROBABILITY ma414l1.tex Lecture 1. 12.1.2012 I. NLYSIS; PROBBILITY 1. Lebesgue Measure and Integral We recall Lebesgue measure (M411 Probability and Measure) λ: defined on intervals (a, b] by λ((a, b]) := b a (so

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information

Chapter 1. Sets and probability. 1.3 Probability space

Chapter 1. Sets and probability. 1.3 Probability space Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability

More information

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University Survival Analysis: Counting Process and Martingale Lu Tian and Richard Olshen Stanford University 1 Lebesgue-Stieltjes Integrals G( ) is a right-continuous step function having jumps at x 1, x 2,.. b f(x)dg(x)

More information

Lecture 6 Basic Probability

Lecture 6 Basic Probability Lecture 6: Basic Probability 1 of 17 Course: Theory of Probability I Term: Fall 2013 Instructor: Gordan Zitkovic Lecture 6 Basic Probability Probability spaces A mathematical setup behind a probabilistic

More information

Probability- the good parts version. I. Random variables and their distributions; continuous random variables.

Probability- the good parts version. I. Random variables and their distributions; continuous random variables. Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density

More information

Analysis Qualifying Exam

Analysis Qualifying Exam Analysis Qualifying Exam Spring 2017 Problem 1: Let f be differentiable on R. Suppose that there exists M > 0 such that f(k) M for each integer k, and f (x) M for all x R. Show that f is bounded, i.e.,

More information

5 Operations on Multiple Random Variables

5 Operations on Multiple Random Variables EE360 Random Signal analysis Chapter 5: Operations on Multiple Random Variables 5 Operations on Multiple Random Variables Expected value of a function of r.v. s Two r.v. s: ḡ = E[g(X, Y )] = g(x, y)f X,Y

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

1 Stat 605. Homework I. Due Feb. 1, 2011

1 Stat 605. Homework I. Due Feb. 1, 2011 The first part is homework which you need to turn in. The second part is exercises that will not be graded, but you need to turn it in together with the take-home final exam. 1 Stat 605. Homework I. Due

More information

PROBABILITY AND STATISTICS IN COMPUTING. III. Discrete Random Variables Expectation and Deviations From: [5][7][6] German Hernandez

PROBABILITY AND STATISTICS IN COMPUTING. III. Discrete Random Variables Expectation and Deviations From: [5][7][6] German Hernandez Conditional PROBABILITY AND STATISTICS IN COMPUTING III. Discrete Random Variables and Deviations From: [5][7][6] Page of 46 German Hernandez Conditional. Random variables.. Measurable function Let (Ω,

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Chapter 1: Probability Theory Lecture 1: Measure space, measurable function, and integration

Chapter 1: Probability Theory Lecture 1: Measure space, measurable function, and integration Chapter 1: Probability Theory Lecture 1: Measure space, measurable function, and integration Random experiment: uncertainty in outcomes Ω: sample space: a set containing all possible outcomes Definition

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Math212a1413 The Lebesgue integral.

Math212a1413 The Lebesgue integral. Math212a1413 The Lebesgue integral. October 28, 2014 Simple functions. In what follows, (X, F, m) is a space with a σ-field of sets, and m a measure on F. The purpose of today s lecture is to develop the

More information

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is

More information

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2) 14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

Real Analysis Math 131AH Rudin, Chapter #1. Dominique Abdi

Real Analysis Math 131AH Rudin, Chapter #1. Dominique Abdi Real Analysis Math 3AH Rudin, Chapter # Dominique Abdi.. If r is rational (r 0) and x is irrational, prove that r + x and rx are irrational. Solution. Assume the contrary, that r+x and rx are rational.

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

Math 127C, Spring 2006 Final Exam Solutions. x 2 ), g(y 1, y 2 ) = ( y 1 y 2, y1 2 + y2) 2. (g f) (0) = g (f(0))f (0).

Math 127C, Spring 2006 Final Exam Solutions. x 2 ), g(y 1, y 2 ) = ( y 1 y 2, y1 2 + y2) 2. (g f) (0) = g (f(0))f (0). Math 27C, Spring 26 Final Exam Solutions. Define f : R 2 R 2 and g : R 2 R 2 by f(x, x 2 (sin x 2 x, e x x 2, g(y, y 2 ( y y 2, y 2 + y2 2. Use the chain rule to compute the matrix of (g f (,. By the chain

More information

Fundamental Tools - Probability Theory II

Fundamental Tools - Probability Theory II Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random

More information

Metric Spaces. Exercises Fall 2017 Lecturer: Viveka Erlandsson. Written by M.van den Berg

Metric Spaces. Exercises Fall 2017 Lecturer: Viveka Erlandsson. Written by M.van den Berg Metric Spaces Exercises Fall 2017 Lecturer: Viveka Erlandsson Written by M.van den Berg School of Mathematics University of Bristol BS8 1TW Bristol, UK 1 Exercises. 1. Let X be a non-empty set, and suppose

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

1 Probability space and random variables

1 Probability space and random variables 1 Probability space and random variables As graduate level, we inevitably need to study probability based on measure theory. It obscures some intuitions in probability, but it also supplements our intuition,

More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

Lecture 7. Sums of random variables

Lecture 7. Sums of random variables 18.175: Lecture 7 Sums of random variables Scott Sheffield MIT 18.175 Lecture 7 1 Outline Definitions Sums of random variables 18.175 Lecture 7 2 Outline Definitions Sums of random variables 18.175 Lecture

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 11 10/15/2008 ABSTRACT INTEGRATION I

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 11 10/15/2008 ABSTRACT INTEGRATION I MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 11 10/15/2008 ABSTRACT INTEGRATION I Contents 1. Preliinaries 2. The ain result 3. The Rieann integral 4. The integral of a nonnegative

More information

Probability and Measure

Probability and Measure Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability

More information

Chapter 3: Random Variables 1

Chapter 3: Random Variables 1 Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

Problem set 1, Real Analysis I, Spring, 2015.

Problem set 1, Real Analysis I, Spring, 2015. Problem set 1, Real Analysis I, Spring, 015. (1) Let f n : D R be a sequence of functions with domain D R n. Recall that f n f uniformly if and only if for all ɛ > 0, there is an N = N(ɛ) so that if n

More information

2 (Statistics) Random variables

2 (Statistics) Random variables 2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes

More information

d(x n, x) d(x n, x nk ) + d(x nk, x) where we chose any fixed k > N

d(x n, x) d(x n, x nk ) + d(x nk, x) where we chose any fixed k > N Problem 1. Let f : A R R have the property that for every x A, there exists ɛ > 0 such that f(t) > ɛ if t (x ɛ, x + ɛ) A. If the set A is compact, prove there exists c > 0 such that f(x) > c for all x

More information

Continuous Random Variables and Continuous Distributions

Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable

More information

1. Aufgabenblatt zur Vorlesung Probability Theory

1. Aufgabenblatt zur Vorlesung Probability Theory 24.10.17 1. Aufgabenblatt zur Vorlesung By (Ω, A, P ) we always enote the unerlying probability space, unless state otherwise. 1. Let r > 0, an efine f(x) = 1 [0, [ (x) exp( r x), x R. a) Show that p f

More information

Final Exam # 3. Sta 230: Probability. December 16, 2012

Final Exam # 3. Sta 230: Probability. December 16, 2012 Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

Chapter 4. Chapter 4 sections

Chapter 4. Chapter 4 sections Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation

More information

Probability Notes. Compiled by Paul J. Hurtado. Last Compiled: September 6, 2017

Probability Notes. Compiled by Paul J. Hurtado. Last Compiled: September 6, 2017 Probability Notes Compiled by Paul J. Hurtado Last Compiled: September 6, 2017 About These Notes These are course notes from a Probability course taught using An Introduction to Mathematical Statistics

More information

Qualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf

Qualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 1: Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section

More information

JORDAN CONTENT. J(P, A) = {m(i k ); I k an interval of P contained in int(a)} J(P, A) = {m(i k ); I k an interval of P intersecting cl(a)}.

JORDAN CONTENT. J(P, A) = {m(i k ); I k an interval of P contained in int(a)} J(P, A) = {m(i k ); I k an interval of P intersecting cl(a)}. JORDAN CONTENT Definition. Let A R n be a bounded set. Given a rectangle (cartesian product of compact intervals) R R n containing A, denote by P the set of finite partitions of R by sub-rectangles ( intervals

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are

More information

REAL VARIABLES: PROBLEM SET 1. = x limsup E k

REAL VARIABLES: PROBLEM SET 1. = x limsup E k REAL VARIABLES: PROBLEM SET 1 BEN ELDER 1. Problem 1.1a First let s prove that limsup E k consists of those points which belong to infinitely many E k. From equation 1.1: limsup E k = E k For limsup E

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Lecture 4: Fourier Transforms.

Lecture 4: Fourier Transforms. 1 Definition. Lecture 4: Fourier Transforms. We now come to Fourier transforms, which we give in the form of a definition. First we define the spaces L 1 () and L 2 (). Definition 1.1 The space L 1 ()

More information

MATHS 730 FC Lecture Notes March 5, Introduction

MATHS 730 FC Lecture Notes March 5, Introduction 1 INTRODUCTION MATHS 730 FC Lecture Notes March 5, 2014 1 Introduction Definition. If A, B are sets and there exists a bijection A B, they have the same cardinality, which we write as A, #A. If there exists

More information

International Competition in Mathematics for Universtiy Students in Plovdiv, Bulgaria 1994

International Competition in Mathematics for Universtiy Students in Plovdiv, Bulgaria 1994 International Competition in Mathematics for Universtiy Students in Plovdiv, Bulgaria 1994 1 PROBLEMS AND SOLUTIONS First day July 29, 1994 Problem 1. 13 points a Let A be a n n, n 2, symmetric, invertible

More information