BMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs


Lecture #7, BMIR Lecture Series on Probability and Statistics, Fall 2015. Department of Biomedical Engineering and Environmental Sciences, National Tsing Hua University.

Function of a Single Variable

Theorem. Suppose that $X$ is a discrete random variable with probability distribution $f_X(x)$. Let $Y = h(X)$ define a one-to-one transformation between the values of $X$ and $Y$, so that the equation $y = h(x)$ can be solved uniquely for $x$ in terms of $y$; let the solution be $x = u(y)$. Then the probability distribution of the random variable $Y$ is
$$ f_Y(y) = P(Y = y) = P(X = u(y)) = f_X(u(y)) \quad (1) $$

Function of a Single Variable: Example

Example. Let $X$ be a geometric random variable with probability distribution $f_X(x) = p(1-p)^{x-1}$, $x = 1, 2, \ldots$ Find the probability distribution of $Y = X^2$.

Since $X > 0$, the map $y = x^2$ (with inverse $x = \sqrt{y}$) is a one-to-one transformation. Therefore, the distribution of $Y$ is
$$ f_Y(y) = f_X(\sqrt{y}) = p(1-p)^{\sqrt{y}-1}, \quad y = 1, 4, 9, \ldots $$
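As a quick numeric sanity check (not part of the original slides; the value $p = 0.3$ is chosen only for illustration), the transformed pmf should still sum to 1 over the perfect squares:

```python
import math

# Sketch: pmf of Y = X^2 when X is geometric with success probability p.
# By the one-to-one transformation, f_Y(y) = p(1-p)^(sqrt(y)-1) for y = 1, 4, 9, ...
p = 0.3  # illustrative value, not from the lecture

def f_X(x):          # geometric pmf, x = 1, 2, 3, ...
    return p * (1 - p) ** (x - 1)

def f_Y(y):          # transformed pmf; y must be a perfect square
    x = math.isqrt(y)
    assert x * x == y
    return f_X(x)

# Summing f_Y over y = 1, 4, 9, ... is the same as summing f_X over x = 1, 2, ...
total = sum(f_Y(x * x) for x in range(1, 200))
print(round(total, 6))  # essentially 1 (the truncated tail is negligible)
```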

Function of Two Variables

Theorem. Suppose that $X_1$ and $X_2$ are discrete random variables with joint probability distribution $f_{X_1X_2}(x_1, x_2)$ (or $f(x_1, x_2)$). Let $Y_1 = h_1(X_1, X_2)$ and $Y_2 = h_2(X_1, X_2)$ define a one-to-one transformation between the points $(x_1, x_2)$ and $(y_1, y_2)$, so that the equations $y_1 = h_1(x_1, x_2)$ and $y_2 = h_2(x_1, x_2)$ can be solved uniquely for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, i.e., $x_1 = u_1(y_1, y_2)$ and $x_2 = u_2(y_1, y_2)$. Then the joint probability distribution of $Y_1$ and $Y_2$ is
$$ f_{Y_1Y_2}(y_1, y_2) = f_{X_1X_2}[u_1(y_1, y_2), u_2(y_1, y_2)] \quad (2) $$
Note that the distribution of $Y_1$ alone can be found by summing over $y_2$, i.e., the marginal probability distribution of $Y_1$ is
$$ f_{Y_1}(y_1) = \sum_{y_2} f_{Y_1Y_2}(y_1, y_2) $$

Sum of Two Independent Poisson RVs I

Example. Suppose that $X_1$ and $X_2$ are independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$. Find the probability distribution of $Y = X_1 + X_2$.

Let $Y = Y_1 = X_1 + X_2$ and $Y_2 = X_2$; hence $X_1 = Y_1 - Y_2$ and $X_2 = Y_2$. Since $X_1$ and $X_2$ are independent, their joint distribution is
$$ f_{X_1X_2}(x_1, x_2) = \frac{\lambda_1^{x_1} e^{-\lambda_1}}{x_1!} \cdot \frac{\lambda_2^{x_2} e^{-\lambda_2}}{x_2!}, \quad x_1 = 0, 1, 2, \ldots, \; x_2 = 0, 1, 2, \ldots $$

Sum of Two Independent Poisson RVs II

The joint distribution of $Y_1$ and $Y_2$ is
$$ f_{Y_1Y_2}(y_1, y_2) = f_{X_1X_2}[u_1(y_1, y_2), u_2(y_1, y_2)] = f_{X_1X_2}(y_1 - y_2, y_2) = \frac{\lambda_1^{y_1-y_2} e^{-\lambda_1}}{(y_1-y_2)!} \cdot \frac{\lambda_2^{y_2} e^{-\lambda_2}}{y_2!} = \frac{e^{-(\lambda_1+\lambda_2)}\,\lambda_1^{y_1-y_2}\,\lambda_2^{y_2}}{(y_1-y_2)!\,y_2!} $$
for $y_1 = 0, 1, 2, \ldots$ and $y_2 = 0, 1, \ldots, y_1$. Note that $x_1 \ge 0$ requires $y_1 \ge y_2$.

Sum of Two Independent Poisson RVs III

The marginal distribution of $Y_1$ is
$$ f_{Y_1}(y_1) = \sum_{y_2=0}^{y_1} f_{Y_1Y_2}(y_1, y_2) = e^{-(\lambda_1+\lambda_2)} \sum_{y_2=0}^{y_1} \frac{\lambda_1^{y_1-y_2}\,\lambda_2^{y_2}}{(y_1-y_2)!\,y_2!} = \frac{e^{-(\lambda_1+\lambda_2)}}{y_1!} \sum_{y_2=0}^{y_1} \frac{y_1!}{(y_1-y_2)!\,y_2!}\,\lambda_1^{y_1-y_2}\,\lambda_2^{y_2} = \frac{e^{-(\lambda_1+\lambda_2)}\,(\lambda_1+\lambda_2)^{y_1}}{y_1!} $$
by the binomial theorem. The random variable $Y = Y_1 = X_1 + X_2$ therefore has a Poisson distribution with parameter $\lambda_1 + \lambda_2$.
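The convolution sum above can be checked numerically; a minimal sketch (the values of $\lambda_1, \lambda_2$ are illustrative):

```python
import math

# Sketch: summing f_{Y1,Y2}(y1, y2) over y2 should reproduce the
# Poisson(lam1 + lam2) pmf, as the binomial-theorem step claims.
lam1, lam2 = 1.5, 2.5  # illustrative parameters

def pois(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

for y1 in range(8):
    marginal = sum(pois(y1 - y2, lam1) * pois(y2, lam2) for y2 in range(y1 + 1))
    assert abs(marginal - pois(y1, lam1 + lam2)) < 1e-12
print("marginal matches Poisson(lam1 + lam2)")
```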

Function of a Single Variable I

Theorem. Suppose that $X$ is a continuous random variable with probability distribution $f_X(x)$. Let $Y = h(X)$ define a one-to-one transformation between the values of $X$ and $Y$, so that the equation $y = h(x)$ can be solved uniquely for $x$ in terms of $y$; let the solution be $x = u(y)$. Then the probability distribution of $Y$ is
$$ f_Y(y) = f_X(u(y))\,|J| \quad (3) $$
where $J = \frac{d}{dy}u(y) = u'(y) = \frac{dx}{dy}$ is called the Jacobian of the transformation, and its absolute value $|J|$ is used.

To see why, suppose first that $y = h(x)$ is an increasing function of $x$.

Function of a Single Variable II

Then
$$ P(Y \le a) = P[X \le u(a)] = \int_{-\infty}^{u(a)} f_X(x)\,dx $$
Substituting $x = u(y)$, so that $dx = u'(y)\,dy$, we obtain
$$ P(Y \le a) = \int_{-\infty}^{a} f_X(u(y))\,u'(y)\,dy $$
The density function of $Y$ is therefore $f_Y(y) = f_X(u(y))\,u'(y) = f_X(u(y))\,J$. If $y = h(x)$ is a decreasing function of $x$, a similar argument holds.

Function of a Single Variable: Example 1 I

Example. Suppose that $X$ has pdf
$$ f(x) = \begin{cases} 2x, & 0 < x < 1 \\ 0, & \text{elsewhere} \end{cases} \quad (4) $$
Let $h(X) = 3X + 1$. Find the pdf of $Y = h(X)$.

Since $y = h(x) = 3x + 1$ is an increasing function of $x$, we obtain $x = u(y) = (y-1)/3$, $J = u'(y) = 1/3$, and
$$ f_Y(y) = f_X(u(y))\,u'(y) = 2 \cdot \frac{y-1}{3} \cdot \frac{1}{3} = \frac{2}{9}(y-1), \quad 1 < y < 4 $$

Function of a Single Variable: Example 1 II

Intuitive approach:
$$ F_Y(y) = P(Y \le y) = P(3X + 1 \le y) = P\left[X \le \frac{y-1}{3}\right] = \int_0^{(y-1)/3} 2x\,dx = \left(\frac{y-1}{3}\right)^2 $$
$$ f_Y(y) = \frac{d}{dy}F_Y(y) = \begin{cases} \frac{2}{9}(y-1), & 1 < y < 4 \\ 0, & \text{elsewhere} \end{cases} $$
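Both derivations of Example 1 can be confirmed by simulation; a minimal Monte Carlo sketch (sample sizes and test points are illustrative):

```python
import random

# Sketch: X has pdf 2x on (0,1), so its cdf is x^2 and X can be sampled by
# inverse transform as sqrt(U). The empirical cdf of Y = 3X + 1 is then
# compared with F_Y(y) = ((y-1)/3)^2 derived above.
random.seed(1)
n = 200_000
ys = [3 * random.random() ** 0.5 + 1 for _ in range(n)]
for y in (1.5, 2.0, 3.0, 3.5):
    empirical = sum(v <= y for v in ys) / n
    exact = ((y - 1) / 3) ** 2
    assert abs(empirical - exact) < 0.01
print("empirical cdf matches ((y-1)/3)^2")
```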

Function of a Single Variable: Example 2 I

Example. Suppose that $X$ has pdf
$$ f(x) = \begin{cases} 2x, & 0 < x < 1 \\ 0, & \text{elsewhere} \end{cases} \quad (5) $$
Let $h(X) = e^{-X}$. Find the pdf of $Y = h(X)$.

Since $y = h(x) = e^{-x}$ is a decreasing function of $x$, we obtain $x = u(y) = -\ln y$, $J = u'(y) = -1/y$, and
$$ f_Y(y) = f_X(u(y))\,|J| = 2(-\ln y) \cdot \frac{1}{y} = -\frac{2\ln y}{y}, \quad 1/e < y < 1 $$

Function of a Single Variable: Example 2 II

Intuitive approach:
$$ F_Y(y) = P(Y \le y) = P(e^{-X} \le y) = P(X \ge -\ln y) = 1 - \int_0^{-\ln y} 2x\,dx = 1 - (\ln y)^2 $$
$$ f_Y(y) = \frac{d}{dy}F_Y(y) = -\frac{2\ln y}{y}, \quad 1/e < y < 1 $$
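Example 2 admits the same kind of Monte Carlo check; a minimal sketch (sample size and test points are illustrative):

```python
import math
import random

# Sketch: with f_X(x) = 2x on (0,1), sample X = sqrt(U) by inverse transform,
# set Y = exp(-X), and compare the empirical cdf with F_Y(y) = 1 - (ln y)^2
# on (1/e, 1), as derived above.
random.seed(7)
n = 200_000
ys = [math.exp(-random.random() ** 0.5) for _ in range(n)]
for y in (0.5, 0.7, 0.9):
    empirical = sum(v <= y for v in ys) / n
    exact = 1 - math.log(y) ** 2
    assert abs(empirical - exact) < 0.01
print("empirical cdf matches 1 - (ln y)^2")
```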

Function of a Single Variable: Non-monotonic Transformation I

Example. Suppose that $X$ has pdf $f(x)$, and let $h(X) = X^2$. Find the pdf of $Y = h(X)$.

Since $y = h(x) = x^2$ is neither an increasing nor a decreasing function of $x$, we cannot directly use Eq. (3). We can instead obtain the pdf of $Y = X^2$ from the cdf:
$$ F_Y(y) = P(Y \le y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}) $$

Function of a Single Variable: Non-monotonic Transformation II

Differentiating,
$$ f_Y(y) = \frac{d}{dy}F_Y(y) = \frac{d}{dy}\left[F_X(\sqrt{y}) - F_X(-\sqrt{y})\right] = \frac{f_X(\sqrt{y})}{2\sqrt{y}} + \frac{f_X(-\sqrt{y})}{2\sqrt{y}} = \frac{1}{2\sqrt{y}}\left[f_X(\sqrt{y}) + f_X(-\sqrt{y})\right] $$

Non-monotonic Transformation: Normal vs Chi-Square I

Example. Suppose that $X \sim N(0,1)$, and let $h(X) = X^2$. Find the pdf of $Y = h(X)$.

$X$ is standard normal with pdf $f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$. Using the formula above,
$$ f_Y(y) = \frac{1}{2\sqrt{y}}\left[f_X(\sqrt{y}) + f_X(-\sqrt{y})\right] = \frac{1}{\sqrt{y}} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-y/2} = \frac{1}{2^{1/2}\sqrt{\pi}}\, y^{1/2-1}\, e^{-y/2}, \quad y > 0 $$

Non-monotonic Transformation: Normal vs Chi-Square II

The gamma distribution has the pdf
$$ f(x; \gamma, \lambda) = \frac{\lambda^\gamma x^{\gamma-1} e^{-\lambda x}}{\Gamma(\gamma)} = \frac{\lambda}{\Gamma(\gamma)}\,(\lambda x)^{\gamma-1} e^{-\lambda x}, \quad x > 0 $$
When $\lambda = 1/2$ and $\gamma = 1/2, 1, 3/2, 2, \ldots$, it becomes the chi-square distribution:
$$ f(x; \gamma, 1/2) = \frac{1}{\Gamma(\gamma)}\left(\frac{1}{2}\right)^{\gamma} x^{\gamma-1} e^{-x/2}, \quad x > 0 $$
Since $\sqrt{\pi} = \Gamma(\tfrac12)$, we can rewrite $f_Y(y)$ as
$$ f_Y(y) = \frac{1}{\Gamma(\tfrac12)}\left(\frac{1}{2}\right)^{1/2} y^{1/2-1} e^{-y/2} $$
so $Y$ has a chi-square distribution (with one degree of freedom).
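The claim that $Z^2$ is chi-square with one degree of freedom can be checked by simulation, using the closed-form cdf $P(Z^2 \le y) = \operatorname{erf}(\sqrt{y/2})$; a minimal sketch (sample size and test points are illustrative):

```python
import math
import random

# Sketch: if Z ~ N(0,1) then Y = Z^2 should be chi-square(1). Its cdf is
# P(|Z| <= sqrt(y)) = 2*Phi(sqrt(y)) - 1 = erf(sqrt(y/2)).
random.seed(2)
n = 200_000
ys = [random.gauss(0, 1) ** 2 for _ in range(n)]
for y in (0.5, 1.0, 2.0, 4.0):
    empirical = sum(v <= y for v in ys) / n
    exact = math.erf(math.sqrt(y / 2))
    assert abs(empirical - exact) < 0.01
print("Z^2 matches the chi-square(1) cdf")
```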

General Transformation

Theorem. Suppose that $X$ is a continuous random variable with probability distribution $f_X(x)$, and $Y = h(X)$ is a transformation that is not one-to-one. If the interval over which $X$ is defined can be partitioned into $m$ mutually disjoint sets such that each of the inverse functions $x_1 = u_1(y), x_2 = u_2(y), \ldots, x_m = u_m(y)$ of $y = h(x)$ is one-to-one, the probability distribution of $Y$ is
$$ f_Y(y) = \sum_{i=1}^{m} f_X[u_i(y)]\,|J_i| \quad (6) $$

Function of Two Variables

Theorem. Suppose that $X_1$ and $X_2$ are continuous random variables with joint probability distribution $f_{X_1X_2}(x_1, x_2)$ (or $f(x_1, x_2)$). Let $Y_1 = h_1(X_1, X_2)$ and $Y_2 = h_2(X_1, X_2)$ define a one-to-one transformation between the points $(x_1, x_2)$ and $(y_1, y_2)$, so that the equations $y_1 = h_1(x_1, x_2)$ and $y_2 = h_2(x_1, x_2)$ can be solved uniquely for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, i.e., $x_1 = u_1(y_1, y_2)$ and $x_2 = u_2(y_1, y_2)$. Then the joint probability distribution of $Y_1$ and $Y_2$ is
$$ f_{Y_1Y_2}(y_1, y_2) = f_{X_1X_2}[u_1(y_1, y_2), u_2(y_1, y_2)]\,|J| \quad (7) $$
where $J$ is the Jacobian, given by the determinant
$$ J = \begin{vmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{vmatrix} \quad (8) $$
and the absolute value of the determinant is used.

Function of Two Variables I

Example. Suppose that $X_1$ and $X_2$ are independent exponential random variables with $f_{X_1}(x_1) = 2e^{-2x_1}$ and $f_{X_2}(x_2) = 2e^{-2x_2}$. Find the probability distribution of $Y = X_1/X_2$.

The joint probability distribution of $X_1$ and $X_2$ is
$$ f_{X_1X_2}(x_1, x_2) = 4e^{-2(x_1+x_2)}, \quad x_1 > 0, \; x_2 > 0 $$
Let $Y = Y_1 = h_1(X_1, X_2) = X_1/X_2$ and $Y_2 = h_2(X_1, X_2) = X_1 + X_2$.

Function of Two Variables II

The inverse solutions of $y_1 = x_1/x_2$ and $y_2 = x_1 + x_2$ are $x_1 = y_1 y_2/(1+y_1)$ and $x_2 = y_2/(1+y_1)$, and it follows that
$$ J = \begin{vmatrix} \dfrac{y_2}{(1+y_1)^2} & \dfrac{y_1}{1+y_1} \\[4pt] -\dfrac{y_2}{(1+y_1)^2} & \dfrac{1}{1+y_1} \end{vmatrix} = \frac{y_2}{(1+y_1)^2} $$
Then the joint probability distribution of $Y_1$ and $Y_2$ is
$$ f_{Y_1Y_2}(y_1, y_2) = f_{X_1X_2}[u_1(y_1, y_2), u_2(y_1, y_2)]\,|J| = 4e^{-2\left(\frac{y_1 y_2}{1+y_1} + \frac{y_2}{1+y_1}\right)} \cdot \frac{y_2}{(1+y_1)^2} = \frac{4 y_2 e^{-2y_2}}{(1+y_1)^2}, \quad y_1 > 0, \; y_2 > 0 $$

Function of Two Variables III

We need the distribution of $Y = Y_1 = X_1/X_2$. The marginal distribution of $Y_1$ is
$$ f_{Y_1}(y_1) = \int_0^\infty f_{Y_1Y_2}(y_1, y_2)\,dy_2 = \int_0^\infty \frac{4 y_2 e^{-2y_2}}{(1+y_1)^2}\,dy_2 = \frac{1}{(1+y_1)^2}, \quad y_1 > 0 $$
so $f_Y(y) = 1/(1+y)^2$ for $y > 0$.
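The resulting density integrates to the cdf $F_Y(y) = y/(1+y)$, which a short simulation can confirm; a minimal sketch (sample size and test points are illustrative):

```python
import random

# Sketch: with X1, X2 independent Exponential(rate 2), the derivation above
# gives f_Y(y) = 1/(1+y)^2 for Y = X1/X2, hence F_Y(y) = y/(1+y).
random.seed(3)
n = 200_000
ratios = [random.expovariate(2) / random.expovariate(2) for _ in range(n)]
for y in (0.5, 1.0, 2.0, 5.0):
    empirical = sum(r <= y for r in ratios) / n
    exact = y / (1 + y)
    assert abs(empirical - exact) < 0.01
print("empirical cdf of X1/X2 matches y/(1+y)")
```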

Example: Function of Two N(0,1) I

Example. Let $X_1$ and $X_2$ be independent random variables, each with a $N(0,1)$ distribution. Let $Y_1 = \frac{X_1+X_2}{2}$ and $Y_2 = \frac{X_2-X_1}{2}$. Show that $Y_1$ and $Y_2$ are independent.

Since $X_1$ and $X_2$ are independent, the joint pdf is
$$ f_{X_1X_2}(x_1, x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2) = \frac{1}{2\pi} \exp\left[-\tfrac12\left(x_1^2 + x_2^2\right)\right] $$
The inverse transformation is $X_1 = u_1(Y_1, Y_2) = Y_1 - Y_2$ and $X_2 = u_2(Y_1, Y_2) = Y_1 + Y_2$, with Jacobian
$$ J = \begin{vmatrix} \partial x_1/\partial y_1 & \partial x_1/\partial y_2 \\ \partial x_2/\partial y_1 & \partial x_2/\partial y_2 \end{vmatrix} = \begin{vmatrix} 1 & -1 \\ 1 & 1 \end{vmatrix} = 2 $$

Example: Function of Two N(0,1) II

The joint pdf of $Y_1$ and $Y_2$ is
$$ f_{Y_1Y_2}(y_1, y_2) = f_{X_1X_2}[u_1(y_1, y_2), u_2(y_1, y_2)]\,|J| = \frac{2}{2\pi} \exp\left\{-\tfrac12\left[(y_1-y_2)^2 + (y_1+y_2)^2\right]\right\} = \frac{1}{\pi} \exp\left[-\left(y_1^2 + y_2^2\right)\right] = \left(\frac{1}{\sqrt{\pi}}\, e^{-y_1^2}\right)\left(\frac{1}{\sqrt{\pi}}\, e^{-y_2^2}\right) $$

Example: Function of Two N(0,1) III

Therefore $f_{Y_1Y_2}(y_1, y_2) = f_{Y_1}(y_1)\,f_{Y_2}(y_2)$, so $Y_1$ and $Y_2$ are independent, and each is normally distributed as $N(0, \tfrac12)$.
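A simulation can illustrate the conclusion; a minimal sketch checking the variances and the (zero) correlation empirically (sample size is illustrative; uncorrelatedness is of course weaker than independence, but it is consistent with the factorized joint pdf):

```python
import random

# Sketch: sample (X1, X2) iid N(0,1), form Y1 = (X1+X2)/2 and Y2 = (X2-X1)/2,
# and check that each has variance 1/2 and that their sample covariance is ~0.
random.seed(6)
n = 200_000
y1s, y2s = [], []
for _ in range(n):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    y1s.append((x1 + x2) / 2)
    y2s.append((x2 - x1) / 2)
var1 = sum(v * v for v in y1s) / n
var2 = sum(v * v for v in y2s) / n
cov = sum(a * b for a, b in zip(y1s, y2s)) / n
assert abs(var1 - 0.5) < 0.01 and abs(var2 - 0.5) < 0.01 and abs(cov) < 0.01
print("Y1, Y2 each have variance 1/2 and are uncorrelated")
```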

Example: Function of Multiple Variables I

Example. Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed random variables from a uniform distribution on $[0,1]$. Let $Y = \max\{X_1, X_2, \ldots, X_n\}$ and $Z = \min\{X_1, X_2, \ldots, X_n\}$. What are the pdfs of $Y$ and $Z$?

The cdf of $Y$:
$$ F_Y(y) = P(Y \le y) = P(\max\{X_1, \ldots, X_n\} \le y) = P(X_1 \le y, \ldots, X_n \le y) = P(X_1 \le y) \cdots P(X_n \le y) = y^n $$

Example: Function of Multiple Variables II

Hence
$$ f_Y(y) = \frac{dF_Y(y)}{dy} = n y^{n-1}, \quad 0 \le y \le 1 $$
Similarly,
$$ F_Z(z) = P(Z \le z) = P(\min\{X_1, \ldots, X_n\} \le z) = 1 - P(X_1 > z, \ldots, X_n > z) = 1 - P(X_1 > z) \cdots P(X_n > z) = 1 - (1-z)^n $$
Hence
$$ f_Z(z) = \frac{dF_Z(z)}{dz} = n(1-z)^{n-1}, \quad 0 \le z \le 1 $$
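The max/min cdfs above can be checked empirically; a minimal sketch (the choice $n = 5$ and the test points are illustrative):

```python
import random

# Sketch: for n iid U(0,1) variables, the max should have cdf y^n and the
# min should have cdf 1 - (1-z)^n, as derived above.
random.seed(4)
n, reps = 5, 100_000
maxes, mins = [], []
for _ in range(reps):
    xs = [random.random() for _ in range(n)]
    maxes.append(max(xs))
    mins.append(min(xs))
for t in (0.3, 0.5, 0.8):
    assert abs(sum(v <= t for v in maxes) / reps - t ** n) < 0.01
    assert abs(sum(v <= t for v in mins) / reps - (1 - (1 - t) ** n)) < 0.01
print("max/min cdfs match y^n and 1-(1-z)^n")
```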

Moment Generating Function (mgf)

Definition (Discrete Random Variable). Let $X$ be a discrete random variable with probability mass function $P(X = x_i) = f(x_i)$, $i = 1, 2, \ldots$ The moment generating function of $X$ is defined by
$$ M_X(t) = E(e^{tX}) = \sum_{i=1}^{\infty} e^{t x_i} f(x_i) \quad (9) $$

Definition (Continuous Random Variable). If $X$ is a continuous random variable with probability density function $f(x)$, we define the moment generating function of $X$ by
$$ M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx \quad (10) $$

MGF of Poisson Distribution

Example. If $X$ is a Poisson random variable, $f(x) = \frac{e^{-\lambda}\lambda^x}{x!}$, with parameter $\lambda > 0$, find its moment generating function.

Solution.
$$ M_X(t) = \sum_{x=0}^{\infty} e^{tx}\,\frac{e^{-\lambda}\lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)} \quad (11) $$
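Eq. (11) can be checked by evaluating the defining series $E(e^{tX})$ numerically; a minimal sketch (the values of $\lambda$ and $t$ are illustrative):

```python
import math

# Sketch: truncate the series sum_x e^{tx} * e^{-lam} lam^x / x! and compare
# with the closed form exp(lam * (e^t - 1)) from Eq. (11). The tail beyond
# x = 80 is astronomically small for these parameters.
lam, t = 2.0, 0.5  # illustrative values
mgf_sum = sum(math.exp(t * x) * math.exp(-lam) * lam ** x / math.factorial(x)
              for x in range(80))
assert abs(mgf_sum - math.exp(lam * (math.exp(t) - 1))) < 1e-10
print("series for E[e^{tX}] matches exp(lam(e^t - 1))")
```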

MGF of Exponential Distribution

Example. If $X$ is an exponential random variable, $f(x) = \lambda e^{-\lambda x}$, with parameter $\lambda > 0$, find its moment generating function.

Solution.
$$ M_X(t) = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \lambda \int_0^\infty e^{-x(\lambda - t)}\,dx = \frac{\lambda}{\lambda - t}, \quad t < \lambda \quad (12) $$

MGF of Normal Distribution (I)

Example. If $X \sim N(0,1)$, find its moment generating function.

Solution. Completing the square, $tx - x^2/2 = t^2/2 - (x-t)^2/2$, so
$$ M_X(t) = E(e^{tX}) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-x^2/2}\,dx = e^{t^2/2} \left[\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-(x-t)^2/2}\,dx\right] = e^{t^2/2} \quad (13) $$
since the bracketed integral is that of a $N(t,1)$ density and equals 1.

MGF of Gamma Distribution I

Example. If $X$ has a gamma distribution with pdf
$$ f(x; \gamma, \lambda) = \begin{cases} \frac{\lambda^\gamma}{\Gamma(\gamma)}\,x^{\gamma-1} e^{-\lambda x}, & 0 < x < \infty \\ 0, & \text{elsewhere} \end{cases} $$
find its moment generating function.
$$ M_X(t) = E(e^{tX}) = \int_0^\infty e^{tx}\,\frac{\lambda^\gamma}{\Gamma(\gamma)}\,x^{\gamma-1} e^{-\lambda x}\,dx = \int_0^\infty \frac{\lambda^\gamma}{\Gamma(\gamma)}\,x^{\gamma-1} e^{-(\lambda - t)x}\,dx $$

MGF of Gamma Distribution II

Let $y = (\lambda - t)x$, so that $x = y/(\lambda - t)$ and $dx = dy/(\lambda - t)$. We have
$$ M_X(t) = \int_0^\infty \frac{\lambda^\gamma}{\Gamma(\gamma)} \left(\frac{y}{\lambda - t}\right)^{\gamma-1} e^{-y}\,\frac{dy}{\lambda - t} = \left(\frac{\lambda}{\lambda - t}\right)^{\gamma} \int_0^\infty \frac{1}{\Gamma(\gamma)}\,y^{\gamma-1} e^{-y}\,dy = \left(\frac{\lambda}{\lambda - t}\right)^{\gamma} = \frac{1}{(1 - t/\lambda)^{\gamma}}, \quad t < \lambda $$

Properties of MGF

Recall the Taylor series of the function $e^x$:
$$ e^x = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^n}{n!} + \cdots $$
Thus
$$ e^{tX} = 1 + tX + \frac{(tX)^2}{2!} + \cdots + \frac{(tX)^n}{n!} + \cdots $$
The moment generating function of the random variable $X$ is therefore
$$ M_X(t) = E(e^{tX}) = 1 + tE(X) + \frac{t^2 E(X^2)}{2!} + \cdots + \frac{t^n E(X^n)}{n!} + \cdots $$

Properties of MGF

Consider the first derivative of $M_X(t)$ with respect to $t$:
$$ M_X'(t) = E(X) + tE(X^2) + \cdots + \frac{n t^{n-1} E(X^n)}{n!} + \cdots $$
Taking $t = 0$, we have $M_X'(0) = E(X) = \mu_X$. Considering the second derivative,
$$ M_X''(t) = E(X^2) + tE(X^3) + \cdots + \frac{n(n-1) t^{n-2} E(X^n)}{n!} + \cdots $$
the second moment of $X$ is $M_X''(0) = E(X^2)$. The variance of $X$ is then given by
$$ V(X) = E(X^2) - [E(X)]^2 = M_X''(0) - [M_X'(0)]^2 $$

Properties of MGF

Differentiating $k$ times and then setting $t = 0$ yields
$$ M_X^{(k)}(0) = E(X^k) \quad (14) $$
If $X$ is a random variable and $\alpha$ is a constant, then
$$ M_{X+\alpha}(t) = e^{\alpha t} M_X(t) \quad (15) $$
$$ M_{\alpha X}(t) = M_X(\alpha t) \quad (16) $$
Let $X$ and $Y$ be two random variables with moment generating functions $M_X(t)$ and $M_Y(t)$, respectively. If $M_X(t) = M_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution.
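Eq. (14) can be illustrated numerically with finite differences in place of symbolic derivatives; a minimal sketch using the exponential mgf from Eq. (12) (the value of $\lambda$ and the step size are illustrative):

```python
# Sketch: recover the first two moments of an Exponential(lam) rv from its
# mgf M(t) = lam/(lam - t) via central finite differences at t = 0,
# illustrating M'(0) = E[X] = 1/lam and M''(0) = E[X^2] = 2/lam^2.
lam, h = 2.0, 1e-4  # illustrative rate and step size

def M(t):
    return lam / (lam - t)

m1 = (M(h) - M(-h)) / (2 * h)             # approximates E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # approximates E[X^2]
assert abs(m1 - 1 / lam) < 1e-6
assert abs(m2 - 2 / lam ** 2) < 1e-4
print("mgf derivatives at 0 give E[X] and E[X^2]")
```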

MGF of Normal Distribution (II)

Example. Use Eq. (13) to find the first four moments of the standard normal rv $X$.

Solution.
$$ M_X'(t) = \frac{d}{dt}\,e^{t^2/2} = t\,e^{t^2/2} = t\,M_X(t), \qquad E(X) = M_X'(0) = 0 $$
$$ M_X''(t) = M_X(t) + t\,M_X'(t) = (1 + t^2)\,M_X(t), \qquad E(X^2) = M_X''(0) = 1 $$
$$ M_X^{(3)}(t) = 2t\,M_X(t) + (1 + t^2)\,M_X'(t) = (3t + t^3)\,M_X(t), \qquad E(X^3) = M_X^{(3)}(0) = 0 $$
$$ M_X^{(4)}(t) = (3 + 3t^2)\,M_X(t) + (3t + t^3)\,M_X'(t), \qquad E(X^4) = M_X^{(4)}(0) = 3 $$

MGF of Normal Distribution (III)

Example. Use Eqs. (15) and (16) to find the mgf of a normal distribution with mean $\mu$ and variance $\sigma^2$.

Let $X$ be normal with mean $\mu$ and variance $\sigma^2$. Then $Z = \frac{X - \mu}{\sigma}$ is a standard normal random variable, and $X = \sigma Z + \mu$. The mgf of $X$ is
$$ M_X(t) = M_{\sigma Z + \mu}(t) = e^{\mu t} M_{\sigma Z}(t) = e^{\mu t} M_Z(\sigma t) = e^{\mu t} e^{(\sigma t)^2/2} = e^{\mu t + \sigma^2 t^2/2} \quad (17) $$

Sum of Independent RVs: MGF I

Theorem. If $X_1, X_2, \ldots, X_n$ are independent random variables with moment generating functions $M_{X_1}(t), M_{X_2}(t), \ldots, M_{X_n}(t)$, respectively, and if $Y = X_1 + X_2 + \cdots + X_n$, then the moment generating function of $Y$ is
$$ M_Y(t) = M_{X_1}(t)\,M_{X_2}(t) \cdots M_{X_n}(t) \quad (18) $$
For the continuous case,
$$ M_Y(t) = E(e^{tY}) = E\left(e^{t(X_1 + X_2 + \cdots + X_n)}\right) = \int \cdots \int e^{t(x_1 + x_2 + \cdots + x_n)}\,f(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n $$

Sum of Independent RVs: MGF II

Since the variables are independent, we have $f(x_1, x_2, \ldots, x_n) = f(x_1)f(x_2) \cdots f(x_n)$. Therefore,
$$ M_Y(t) = \int e^{t x_1} f(x_1)\,dx_1 \int e^{t x_2} f(x_2)\,dx_2 \cdots \int e^{t x_n} f(x_n)\,dx_n = M_{X_1}(t)\,M_{X_2}(t) \cdots M_{X_n}(t) $$

MGF of Sum of Two Independent Poisson RVs

Example. Suppose that $X_1$ and $X_2$ are independent Poisson random variables with parameters $\lambda_1$ and $\lambda_2$. Find the probability distribution of $Y = X_1 + X_2$.

The mgfs of $X_1$ and $X_2$ are $M_{X_1}(t) = e^{\lambda_1(e^t - 1)}$ and $M_{X_2}(t) = e^{\lambda_2(e^t - 1)}$, respectively. The mgf of $Y$ is
$$ M_Y(t) = M_{X_1}(t)\,M_{X_2}(t) = e^{\lambda_1(e^t - 1)}\,e^{\lambda_2(e^t - 1)} = e^{(\lambda_1 + \lambda_2)(e^t - 1)} \quad (19) $$
Therefore, $Y$ is also a Poisson random variable, with parameter $\lambda_1 + \lambda_2$.
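The mgf conclusion agrees with simulation; a minimal sketch (the sampler, parameter values, and sample size are illustrative) checks that the sum has the mean and variance $\lambda_1 + \lambda_2$ expected of a Poisson variable:

```python
import math
import random

# Sketch: simulate independent Poisson variates by inverse-transform
# sampling and check that their sum has mean and variance lam1 + lam2,
# consistent with Y ~ Poisson(lam1 + lam2) from Eq. (19).
def poisson_sample(lam, rng):
    u, k = rng.random(), 0
    p = c = math.exp(-lam)       # P(X = 0) and running cdf
    while u > c:
        k += 1
        p *= lam / k             # P(X = k) from P(X = k-1)
        c += p
    return k

rng = random.Random(5)
lam1, lam2 = 1.0, 3.0  # illustrative parameters
sums = [poisson_sample(lam1, rng) + poisson_sample(lam2, rng)
        for _ in range(100_000)]
mean = sum(sums) / len(sums)
var = sum((s - mean) ** 2 for s in sums) / len(sums)
assert abs(mean - (lam1 + lam2)) < 0.05
assert abs(var - (lam1 + lam2)) < 0.1
print("sum behaves like Poisson(lam1 + lam2)")
```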


MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

Econ 508B: Lecture 5

Econ 508B: Lecture 5 Econ 508B: Lecture 5 Expectation, MGF and CGF Hongyi Liu Washington University in St. Louis July 31, 2017 Hongyi Liu (Washington University in St. Louis) Math Camp 2017 Stats July 31, 2017 1 / 23 Outline

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

Lecture 11. Multivariate Normal theory

Lecture 11. Multivariate Normal theory 10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances

More information

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment:

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment: Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

APPM/MATH 4/5520 Solutions to Problem Set Two. = 2 y = y 2. e 1 2 x2 1 = 1. (g 1

APPM/MATH 4/5520 Solutions to Problem Set Two. = 2 y = y 2. e 1 2 x2 1 = 1. (g 1 APPM/MATH 4/552 Solutions to Problem Set Two. Let Y X /X 2 and let Y 2 X 2. (We can select Y 2 to be anything but when dealing with a fraction for Y, it is usually convenient to set Y 2 to be the denominator.)

More information

Random Vectors and Multivariate Normal Distributions

Random Vectors and Multivariate Normal Distributions Chapter 3 Random Vectors and Multivariate Normal Distributions 3.1 Random vectors Definition 3.1.1. Random vector. Random vectors are vectors of random 75 variables. For instance, X = X 1 X 2., where each

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

The Multivariate Normal Distribution 1

The Multivariate Normal Distribution 1 The Multivariate Normal Distribution 1 STA 302 Fall 2017 1 See last slide for copyright information. 1 / 40 Overview 1 Moment-generating Functions 2 Definition 3 Properties 4 χ 2 and t distributions 2

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100]

HW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100] HW7 Solutions. 5 pts.) James Bond James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from from base A to base B, distance km apart. Now suppose the plane takes off from

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on

More information

p. 6-1 Continuous Random Variables p. 6-2

p. 6-1 Continuous Random Variables p. 6-2 Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

IE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes.

IE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes. Closed book and notes. 60 minutes. A summary table of some univariate continuous distributions is provided. Four Pages. In this version of the Key, I try to be more complete than necessary to receive full

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

1 Expectation of a continuously distributed random variable

1 Expectation of a continuously distributed random variable OCTOBER 3, 204 LECTURE 9 EXPECTATION OF A CONTINUOUSLY DISTRIBUTED RANDOM VARIABLE, DISTRIBUTION FUNCTION AND CHANGE-OF-VARIABLE TECHNIQUES Expectation of a continuously distributed random variable Recall

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous

More information

Solution to Assignment 3

Solution to Assignment 3 The Chinese University of Hong Kong ENGG3D: Probability and Statistics for Engineers 5-6 Term Solution to Assignment 3 Hongyang Li, Francis Due: 3:pm, March Release Date: March 8, 6 Dear students, The

More information

BIOS 2083 Linear Models Abdus S. Wahed. Chapter 2 84

BIOS 2083 Linear Models Abdus S. Wahed. Chapter 2 84 Chapter 2 84 Chapter 3 Random Vectors and Multivariate Normal Distributions 3.1 Random vectors Definition 3.1.1. Random vector. Random vectors are vectors of random variables. For instance, X = X 1 X 2.

More information

Partial Solutions for h4/2014s: Sampling Distributions

Partial Solutions for h4/2014s: Sampling Distributions 27 Partial Solutions for h4/24s: Sampling Distributions ( Let X and X 2 be two independent random variables, each with the same probability distribution given as follows. f(x 2 e x/2, x (a Compute the

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

MATH2715: Statistical Methods

MATH2715: Statistical Methods MATH2715: Statistical Methods Exercises IV (based on lectures 7-8, work week 5, hand in lecture Mon 30 Oct) ALL questions count towards the continuous assessment for this module. Q1. If a random variable

More information

The Multivariate Normal Distribution. In this case according to our theorem

The Multivariate Normal Distribution. In this case according to our theorem The Multivariate Normal Distribution Defn: Z R 1 N(0, 1) iff f Z (z) = 1 2π e z2 /2. Defn: Z R p MV N p (0, I) if and only if Z = (Z 1,..., Z p ) T with the Z i independent and each Z i N(0, 1). In this

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, Practice Final Examination (Winter 2017)

UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, Practice Final Examination (Winter 2017) UCSD ECE250 Handout #27 Prof. Young-Han Kim Friday, June 8, 208 Practice Final Examination (Winter 207) There are 6 problems, each problem with multiple parts. Your answer should be as clear and readable

More information

Chapter 5 Joint Probability Distributions

Chapter 5 Joint Probability Distributions Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 5 Joint Probability Distributions 5 Joint Probability Distributions CHAPTER OUTLINE 5-1 Two

More information

Measure-theoretic probability

Measure-theoretic probability Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is

More information

Probability Distributions

Probability Distributions Lecture : Background in Probability Theory Probability Distributions The probability mass function (pmf) or probability density functions (pdf), mean, µ, variance, σ 2, and moment generating function (mgf)

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Continuous Distributions

Continuous Distributions Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study

More information

LIST OF FORMULAS FOR STK1100 AND STK1110

LIST OF FORMULAS FOR STK1100 AND STK1110 LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

MTH739U/P: Topics in Scientific Computing Autumn 2016 Week 6

MTH739U/P: Topics in Scientific Computing Autumn 2016 Week 6 MTH739U/P: Topics in Scientific Computing Autumn 16 Week 6 4.5 Generic algorithms for non-uniform variates We have seen that sampling from a uniform distribution in [, 1] is a relatively straightforward

More information