Probability and Distributions

What is a statistical model?

A statistical model is a set of assumptions by which the hypothetical population distribution of the data is inferred. It is typically postulated that the statistical model is a family of density functions determined by parameters, and that the data are a realization of iid random variables, called a random sample. Once the statistical model is constructed, the basic notions of random variables and their distributions become readily applicable; thus, a solid understanding of probability theory in terms of calculus (as opposed to measure theory) is the key to a successful investigation of statistical models.

Concept of random variables.

A numerically valued function X of an outcome ω from a sample space Ω,
$$X : \Omega \to \mathbb{R} : \omega \mapsto X(\omega),$$
is called a random variable (r.v.), and is usually determined by an experiment. We conventionally denote random variables by uppercase letters X, Y, Z, U, V, ... from the end of the alphabet. In particular, a discrete random variable is a random variable that can take values in a finite set {a_1, a_2, ..., a_n} of real numbers (usually integers), or in a countably infinite set {a_1, a_2, a_3, ...}. A statement such as "X = a_i" is an event, since {ω : X(ω) = a_i} is a subset of the sample space Ω. Thus, we can consider the probability of the event {X = a_i}, denoted by P(X = a_i). The function p(a_i) := P(X = a_i) over the possible values a_1, a_2, ... of X is called a frequency function, or a probability mass function.

Continuous random variable.

A continuous random variable is a random variable whose possible values are real numbers such as 78.6, 5.7, 10.24, and so on. Examples of continuous random variables include temperature, height, the diameter of a metal cylinder, etc. In what follows, a random variable always means a continuous random variable, unless it is explicitly said to be discrete. The probability distribution of a random variable X specifies how its values are distributed over the real numbers. This is completely characterized by the cumulative distribution function (cdf). The cdf
$$F(t) := P(X \le t)$$
represents the probability that the random variable X is less than or equal to t. We then say that the random variable X is distributed as F.

Probability distributions.

It is often the case that the probability of the random variable X being in any particular range of values is given by the area under a curve over that range of values. This curve is called the probability density function (pdf) of the random variable X, denoted by f(x). Thus, the probability that a ≤ X ≤ b can be expressed as
$$P(a \le X \le b) = \int_a^b f(x)\, dx.$$
Furthermore, we can find the following relationship between the cdf and the pdf:
$$F(t) = P(X \le t) = \int_{-\infty}^{t} f(x)\, dx.$$
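
The cdf/pdf relationship can be checked numerically. The following is a minimal sketch (assuming NumPy/SciPy are available; the exponential distribution is used purely as an illustration and is not part of the notes):

```python
# Minimal numerical check that F(t) equals the integral of f(x) up to t.
# Illustrative only: uses the standard exponential distribution from SciPy.
from scipy import stats
from scipy.integrate import quad

t = 1.5
cdf_value = stats.expon.cdf(t)               # F(t) = P(X <= t)
integral, _ = quad(stats.expon.pdf, 0.0, t)  # area under the pdf from 0 to t
print(cdf_value, integral)                   # both approximately 0.7769
```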

Joint distributions.

When a pair (X, Y) of random variables is considered, a joint density function f(x, y) is used to compute probabilities constructed from the random variables X and Y simultaneously. Given the joint density function f(x, y), the distribution of each of X and Y is called its marginal distribution. The marginal density functions of X and Y, denoted by f_X(x) and f_Y(y), are given respectively by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy \quad\text{and}\quad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.$$

Conditional probability distributions.

Suppose that two random variables X and Y have a joint density function f(x, y). If f_X(x) > 0, then we can define the conditional density function f_{Y|X}(y | x) given X = x by
$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}.$$
Similarly, we can define the conditional density function f_{X|Y}(x | y) given Y = y by
$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} \quad\text{if } f_Y(y) > 0.$$
Then clearly we have the relation
$$f(x, y) = f_{Y|X}(y \mid x)\, f_X(x) = f_{X|Y}(x \mid y)\, f_Y(y).$$

Independent random variables.

If the joint density function f(x, y) of continuous random variables X and Y can be expressed in the form
$$f(x, y) = f_X(x)\, f_Y(y) \quad\text{for all } x, y,$$
then X and Y are said to be independent. Similarly, we can construct a joint density function f(x_1, ..., x_n) for a vector (X_1, ..., X_n) of random variables. If it satisfies
$$f(x_1, \ldots, x_n) = f_1(x_1) \cdots f_n(x_n) \quad\text{for all } x_1, \ldots, x_n,$$
then X_1, ..., X_n are said to be independent. Furthermore, if X_1, ..., X_n share a common density function f, then X_1, ..., X_n are called independent and identically distributed, or iid for short.

Expectation.

Let X be a random variable, and let f be the pdf of X. Then the expectation (or expected value, or mean) of the random variable X is given by
$$E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx. \qquad (1.1)$$
We denote the expected value (1.1) by E(X), µ, or µ_X. For a function g, we define the expectation E[g(X)] of the function g(X) of the random variable X by
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx.$$

Expectation, continued.

One can think of the expectation E(X) as an operation on a random variable X which returns the average value for X. It is also useful to observe that this is a linear operator satisfying
$$E\!\left[a + \sum_{i=1}^{n} b_i X_i\right] = a + \sum_{i=1}^{n} b_i\, E[X_i]$$
for random variables X_1, ..., X_n and constants a and b_1, ..., b_n.

Variance and covariance.

The variance of a random variable X, denoted by Var(X), is the expected value of the squared difference between the random variable and its expected value E(X). Thus, it is given by
$$\mathrm{Var}(X) := E[(X - E(X))^2] = E[X^2] - (E[X])^2.$$
The square root $\sqrt{\mathrm{Var}(X)}$ is called the standard error (SE) (or standard deviation (SD)) of the random variable X. Now suppose that we have two random variables X and Y. Then the covariance of X and Y can be defined as
$$\mathrm{Cov}(X, Y) := E[(X - \mu_x)(Y - \mu_y)] = E[XY] - E[X]\, E[Y],$$
where µ_x = E(X) and µ_y = E(Y). The covariance Cov(X, Y) measures the strength of the dependence between the two random variables.
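
As a hedged illustration (not part of the original notes), the identities Var(X) = E[X²] − (E[X])² and Cov(X, Y) = E[XY] − E[X]E[Y] can be checked by simulation; the distributions below are arbitrary choices:

```python
# Monte Carlo sketch of the variance and covariance identities.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.5, size=100_000)            # X ~ N(2, 1.5^2)
y = 0.5 * x + rng.normal(0.0, 1.0, size=100_000)  # Y depends on X

var_x = np.mean(x**2) - np.mean(x)**2
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(var_x)   # close to 1.5^2 = 2.25
print(cov_xy)  # close to 0.5 * Var(X) = 1.125
```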

Variance and covariance, continued.

In contrast to the expectation, the variance is not a linear operator. The variance of aX + b with constants a and b is given by
$$\mathrm{Var}(aX + b) = a^2\, \mathrm{Var}(X).$$
For two random variables X and Y, we have
$$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y). \qquad (1.2)$$
However, if X and Y are independent, then Cov(X, Y) = 0, and consequently Var(X + Y) = Var(X) + Var(Y). In general, for a sequence of independent random variables X_1, ..., X_n, we have
$$\mathrm{Var}(X_1 + \cdots + X_n) = \mathrm{Var}(X_1) + \cdots + \mathrm{Var}(X_n).$$

Normal distributions.

The normal distributions form a family of distributions which have the same general shape, sometimes described as bell shaped. The normal distribution has the pdf
$$f(x) := \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left[-\frac{(x-\mu)^2}{2\sigma^2}\right], \qquad (1.3)$$
which depends upon two parameters µ and σ². Here π ≈ 3.14159 is the famous pi (the ratio of the circumference of a circle to its diameter), and exp(u) is the exponential function e^u with the base e ≈ 2.71828 of the natural logarithm.

Normal distributions, continued.

The following integrals are much celebrated:
$$\int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx = 1, \qquad (1.4)$$
$$\int_{-\infty}^{\infty} \frac{x}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx = \mu,$$
$$\int_{-\infty}^{\infty} \frac{(x-\mu)^2}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx = \sigma^2.$$
The integral (1.4) guarantees that the density function (1.3) always represents a probability density, no matter what values of the parameters µ and σ² are chosen.
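
The three integrals can also be verified numerically; the sketch below (assuming SciPy; µ and σ are arbitrary illustrative values) evaluates them by quadrature:

```python
# Numerical check of the normal integrals: total mass, mean, and variance.
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.0, 2.0

def f(x):
    # normal density (1.3) with parameters mu and sigma
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

total, _ = quad(f, -np.inf, np.inf)                            # should be 1
mean,  _ = quad(lambda x: x * f(x), -np.inf, np.inf)           # should be mu
var,   _ = quad(lambda x: (x - mu)**2 * f(x), -np.inf, np.inf) # should be sigma^2
print(total, mean, var)
```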

Parameters of normal distributions.

We say that a random variable X is normally distributed with parameter (µ, σ²) when X has the density function (1.3). The parameter µ, called the mean (or location parameter), gives the center of the density, and the density function f(x) is symmetric around µ. The parameter σ is the standard deviation (or scale parameter); small values of σ lead to high peaks but sharp drops, while larger values of σ lead to flatter densities. The shorthand notation X ~ N(µ, σ²) is often used to express that X is a normal random variable with parameter (µ, σ²).

Linear transform of a normal random variable.

One of the important properties of the normal distribution is that if X is a normal random variable with parameter (µ, σ²) [that is, the pdf of X is given by the density function (1.3)], then Y = aX + b is also a normal random variable, with parameter (aµ + b, (aσ)²). In particular,
$$\frac{X - \mu}{\sigma} \qquad (1.5)$$
becomes a normal random variable with parameter (0, 1), called the z-score. The normal density φ(x) with parameter (0, 1),
$$\phi(x) := \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2},$$
is known as the standard normal density. We define the cdf Φ by
$$\Phi(t) := \int_{-\infty}^{t} \phi(x)\, dx.$$
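
A short sketch (illustrative values, assuming SciPy) of how the z-score and the standard normal cdf Φ are used in practice:

```python
# Standardizing X ~ N(mu, sigma^2) and evaluating P(X <= t) through Phi.
from scipy import stats

mu, sigma, t = 10.0, 3.0, 13.0
z = (t - mu) / sigma                           # z-score of t
print(stats.norm.cdf(z))                       # P(X <= t) via Phi((t - mu)/sigma)
print(stats.norm.cdf(t, loc=mu, scale=sigma))  # the same probability, computed directly
```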

Gamma distribution.

The gamma function, denoted by Γ(α), is defined as
$$\Gamma(\alpha) = \int_0^{\infty} u^{\alpha - 1} e^{-u}\, du, \quad \alpha > 0.$$
Then the gamma density is defined as
$$f(t) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, t^{\alpha - 1} e^{-\lambda t}, \quad t \ge 0,$$
which depends on two parameters α > 0 and λ > 0. [A statistics textbook often uses the parameter β = 1/λ instead.]

Parameters of the gamma/exponential distribution.

We call the parameter α a shape parameter, because changing α changes the shape of the density. We call the parameter λ a rate parameter, because changing λ merely rescales the density without changing its shape. In particular, the gamma distribution with α = 1 becomes an exponential distribution (with parameter λ). The parameter β = 1/λ, if you choose to use it, is called a scale parameter.
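
The rate/scale convention matters when using software. As a hedged sketch, SciPy's gamma and exponential distributions are parameterized by a scale, so a rate λ corresponds to scale = 1/λ (the values below are illustrative):

```python
# Rate vs. scale: gamma with shape alpha and rate lambda corresponds to scale = 1/lambda.
from scipy import stats

alpha, lam, x = 3.0, 2.0, 1.2
print(stats.gamma.pdf(x, a=alpha, scale=1.0 / lam))  # gamma density with shape alpha, rate lam
print(stats.gamma.pdf(x, a=1.0, scale=1.0 / lam))    # alpha = 1 reduces to the exponential
print(stats.expon.pdf(x, scale=1.0 / lam))           # exponential(lam) density, same value as above
```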

Chi-square distribution.

The gamma distribution
$$f(x) = \frac{1}{2^{n/2}\,\Gamma(n/2)}\, x^{n/2 - 1} e^{-x/2}, \quad x \ge 0,$$
with α = n/2 and λ = 1/2 is called the chi-square distribution with n degrees of freedom. It plays a vital role in understanding another important distribution, called the t-distribution. A chi-square random variable X has mean E[X] = n and variance Var(X) = 2n.

Chi-square distribution of one degree of freedom.

Let X be a standard normal random variable, and let Y = X² be the square of X. Then we can express the cdf F_Y of Y as
$$F_Y(t) = P(X^2 \le t) = P(-\sqrt{t} \le X \le \sqrt{t}) = \Phi(\sqrt{t}) - \Phi(-\sqrt{t})$$
in terms of the standard normal cdf Φ. By differentiating the cdf, we can obtain the pdf f_Y as follows:
$$f_Y(t) = \frac{d}{dt} F_Y(t) = \frac{1}{\sqrt{2\pi}}\, t^{-1/2} e^{-t/2} = \frac{1}{2^{1/2}\,\Gamma(1/2)}\, t^{-1/2} e^{-t/2},$$
where we use Γ(1/2) = √π. Thus, the square X² of X is a chi-square random variable with 1 degree of freedom.
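
This fact can be checked by simulation. A minimal Monte Carlo sketch (assuming NumPy/SciPy) compares the empirical distribution of X² with the chi-square(1) cdf:

```python
# The square of a standard normal compared against the chi-square(1) cdf.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
y = rng.standard_normal(200_000) ** 2  # Y = X^2 with X standard normal

t = 1.0
print(np.mean(y <= t))          # empirical P(Y <= t)
print(stats.chi2.cdf(t, df=1))  # theoretical chi-square(1) cdf at t
```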

Chi-square distribution with n degrees of freedom.

Let X_1, ..., X_n be iid standard normal random variables. Since the X_i² each have the gamma distribution with parameter (1/2, 1/2), the sum Y = X_1² + ⋯ + X_n² has the gamma distribution with parameter (n/2, 1/2). That is, the sum Y has the chi-square distribution with n degrees of freedom.

Student's t-distribution.

Let X be a chi-square random variable with n degrees of freedom, and let Y be a standard normal random variable. Suppose that X and Y are independent. Then the distribution of the quotient
$$Z = \frac{Y}{\sqrt{X/n}}$$
is called the Student's t-distribution with n degrees of freedom. The pdf f_Z(z) is given by
$$f_Z(z) = \frac{\Gamma((n+1)/2)}{\sqrt{n\pi}\,\Gamma(n/2)}\, \left(1 + z^2/n\right)^{-(n+1)/2}, \quad -\infty < z < \infty.$$
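
The quotient construction can be simulated directly. The following sketch (illustrative, n = 5) draws Z and X independently and compares a tail probability with the t(n) distribution:

```python
# Simulating Y / sqrt(X/n) and comparing with the Student's t tail probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 5
y = rng.standard_normal(200_000)    # standard normal numerator
x = rng.chisquare(n, size=200_000)  # independent chi-square(n) denominator
t_samples = y / np.sqrt(x / n)

print(np.mean(t_samples > 2.0))     # empirical P(T > 2)
print(stats.t.sf(2.0, df=n))        # theoretical tail probability of t(n)
```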

Sample statistics.

Let X_1, ..., X_n be independent and identically distributed random variables ("iid random variables" for short). Then the random variable
$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$$
is called the sample mean, and the random variable
$$S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2$$
is called the sample variance.

Sample statistics under the normal assumption.

Assume that X_1, ..., X_n are iid normal random variables with parameter (µ, σ²). Then the sample statistics X̄ and S² have the following properties:

1. E[X̄] = µ and Var(X̄) = σ²/n.
2. The sample variance S² can be rewritten as
$$S^2 = \frac{1}{n-1}\left(\sum_{i=1}^{n} X_i^2 - n\bar{X}^2\right) = \frac{1}{n-1}\left(\sum_{i=1}^{n} (X_i - \mu)^2 - n(\bar{X} - \mu)^2\right).$$

Sample statistics under the normal assumption, continued.

In particular, we can obtain
$$E[S^2] = \frac{1}{n-1}\left(\sum_{i=1}^{n} E[(X_i - \mu)^2] - n\,E[(\bar{X} - \mu)^2]\right) = \frac{1}{n-1}\left(\sum_{i=1}^{n} \mathrm{Var}(X_i) - n\,\mathrm{Var}(\bar{X})\right) = \sigma^2.$$
Moreover, X̄ has the normal distribution with parameter (µ, σ²/n), and X̄ and S² are independent. Furthermore,
$$W_{n-1} = \frac{(n-1)S^2}{\sigma^2} = \frac{1}{\sigma^2} \sum_{i=1}^{n} (X_i - \bar{X})^2$$
has the chi-square distribution with (n − 1) degrees of freedom.

t-statistic.

Assuming iid normal random variables X_1, ..., X_n with parameter (µ, σ²), we can construct a standard normal random variable
$$Z_1 = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$$
and a chi-square random variable
$$W_{n-1} = \frac{(n-1)S^2}{\sigma^2}.$$
Furthermore, the function of the random sample
$$T = \frac{\bar{X} - \mu}{S/\sqrt{n}} = \frac{(\bar{X} - \mu)\big/(\sigma/\sqrt{n})}{\sqrt{\big((n-1)S^2/\sigma^2\big)\big/(n-1)}} = \frac{Z_1}{\sqrt{W_{n-1}/(n-1)}}$$
has the t-distribution with (n − 1) degrees of freedom, and is called the t-statistic. The statistic S = √(S²) is called the sample standard deviation.
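
A quick simulation sketch (illustrative parameters, assuming NumPy/SciPy) of the t-statistic built from iid normal samples, compared with the t(n − 1) distribution:

```python
# The t-statistic (Xbar - mu) / (S / sqrt(n)) compared with the t(n-1) tail.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n = 5.0, 2.0, 10
samples = rng.normal(mu, sigma, size=(100_000, n))

xbar = samples.mean(axis=1)
s = samples.std(axis=1, ddof=1)          # sample standard deviation S
t_stat = (xbar - mu) / (s / np.sqrt(n))

print(np.mean(t_stat > 1.5))             # empirical P(T > 1.5)
print(stats.t.sf(1.5, df=n - 1))         # theoretical t(n-1) tail probability
```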

Transformation of one random variable.

Let X be a random variable with pdf f_X(x), and let g(x) be a differentiable and strictly monotonic function (that is, either strictly increasing or strictly decreasing) on the real line. Then the inverse function g⁻¹ exists, and the random variable Y = g(X) has the pdf
$$f_Y(y) = f_X(g^{-1}(y))\, \left|\frac{d}{dy} g^{-1}(y)\right|. \qquad (1.6)$$

Linear transform of a random variable.

Let X be a random variable with pdf f_X(x). Then we can define the linear transform Y = aX + b with real constants a ≠ 0 and b. To compute the density function f_Y(y) of the random variable Y, we can apply (1.6) to obtain
$$f_Y(y) = f_X\!\left(\frac{y - b}{a}\right) \frac{1}{|a|}.$$
For example, suppose that X is a normal random variable with parameter (µ, σ²). Then the density function f_Y(y) for the linear transform Y = aX + b becomes
$$f_Y(y) = \frac{1}{|a|\sigma\sqrt{2\pi}} \exp\!\left[-\frac{(y - (a\mu + b))^2}{2(a\sigma)^2}\right],$$
which implies that Y is a normal random variable with parameter (aµ + b, (aσ)²).
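
The transformed density can be checked pointwise against the stated normal density; the sketch below uses arbitrary illustrative values of µ, σ, a, b and assumes SciPy:

```python
# Check: f_X((y-b)/a) / |a| agrees with the N(a*mu + b, (a*sigma)^2) density.
from scipy import stats

mu, sigma, a, b, y = 1.0, 2.0, 3.0, -1.0, 4.5
lhs = stats.norm.pdf((y - b) / a, loc=mu, scale=sigma) / abs(a)   # transformation formula (1.6)
rhs = stats.norm.pdf(y, loc=a * mu + b, scale=abs(a) * sigma)     # density of N(a*mu+b, (a*sigma)^2)
print(lhs, rhs)  # the two values coincide
```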

Change of variables in a double integral.

For a rectangle A = {(x, y) : a_1 ≤ x ≤ b_1, a_2 ≤ y ≤ b_2} in the plane, the integration of a function f(x, y) over A is formally written as
$$\iint_A f(x, y)\, dx\, dy = \int_{a_2}^{b_2}\!\int_{a_1}^{b_1} f(x, y)\, dx\, dy. \qquad (1.7)$$
Suppose that a transformation (g_1(x, y), g_2(x, y)) is differentiable and has the inverse transformation (h_1(u, v), h_2(u, v)) satisfying
$$\begin{cases} x = h_1(u, v) \\ y = h_2(u, v) \end{cases} \quad\text{if and only if}\quad \begin{cases} u = g_1(x, y) \\ v = g_2(x, y). \end{cases} \qquad (1.8)$$

Change of variables in a double integral, continued.

Then we can state the change of variables in the double integral (1.7) as follows: (i) define the inverse image of A by
$$h^{-1}(A) = g(A) = \{(g_1(x, y), g_2(x, y)) : (x, y) \in A\};$$
(ii) calculate the Jacobian
$$J_h(u, v) = \det\begin{bmatrix} \dfrac{\partial h_1}{\partial u}(u, v) & \dfrac{\partial h_1}{\partial v}(u, v) \\[1ex] \dfrac{\partial h_2}{\partial u}(u, v) & \dfrac{\partial h_2}{\partial v}(u, v) \end{bmatrix};$$
(iii) finally, if the Jacobian does not vanish, then we obtain
$$\iint_A f(x, y)\, dx\, dy = \iint_{g(A)} f(h_1(u, v), h_2(u, v))\, |J_h(u, v)|\, du\, dv.$$

Transformation of two random variables.

Suppose that (a) the random variables X and Y have a joint density function f_{X,Y}(x, y), (b) the transformation (g_1(x, y), g_2(x, y)) is differentiable, and (c) its inverse transformation (h_1(u, v), h_2(u, v)) satisfies (1.8). Then the random variables U = g_1(X, Y) and V = g_2(X, Y) have the joint density function
$$f_{U,V}(u, v) = f_{X,Y}(h_1(u, v), h_2(u, v))\, |J_h(u, v)|.$$

Order statistics.

Let X_1, ..., X_n be independent and identically distributed random variables (iid random variables) with common cdf F(x) and pdf f(x). Then we can define the random variable V = min(X_1, ..., X_n), the minimum of X_1, ..., X_n. To find the distribution of V, consider the survival function P(V > x) of V and calculate as follows:
$$P(V > x) = P(X_1 > x) \cdots P(X_n > x) = [1 - F(x)]^n.$$
Thus, we can obtain the cdf F_V(x) and the pdf f_V(x) of V:
$$F_V(x) = 1 - [1 - F(x)]^n \quad\text{and}\quad f_V(x) = \frac{d}{dx} F_V(x) = n f(x) [1 - F(x)]^{n-1}.$$
In particular, if X_1, ..., X_n are independent exponential random variables with common parameter λ, then the minimum min(X_1, ..., X_n) has the density (nλ) e^{−(nλ)x}, which is again an exponential density function, with parameter nλ.
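
The exponential-minimum fact can be verified by simulation; a small sketch (illustrative λ and n, assuming NumPy) compares the empirical survival probability with e^{−nλt}:

```python
# The minimum of n iid exponential(lambda) variables is exponential with rate n*lambda.
import numpy as np

rng = np.random.default_rng(4)
lam, n, t = 0.5, 4, 0.8
x = rng.exponential(scale=1.0 / lam, size=(200_000, n))
v = x.min(axis=1)            # V = min(X_1, ..., X_n) for each simulated sample

print(np.mean(v > t))        # empirical P(V > t)
print(np.exp(-n * lam * t))  # theoretical survival [1 - F(t)]^n = e^{-n*lambda*t}
```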

Order statistics, continued.

Let X_1, ..., X_n be iid random variables with common cdf F(x) and pdf f(x). When we sort X_1, ..., X_n as X_(1) < X_(2) < ⋯ < X_(n), the random variable X_(k) is called the k-th order statistic.

Theorem. The pdf of the k-th order statistic X_(k) is given by
$$f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, f(x)\, F(x)^{k-1}\, [1 - F(x)]^{n-k}.$$

Uniform and beta distribution.

When X_1, ..., X_n are iid uniform random variables on [0, θ], the pdf of X_(k)/θ becomes
$$f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, x^{k-1} (1 - x)^{n-k}, \quad 0 \le x \le 1,$$
which is known as the beta distribution with parameters α = k and β = n − k + 1.

Uniform on [0, θ]: f(x) = 1/θ for 0 ≤ x ≤ θ; E[X] = θ/2, Var(X) = θ²/12.
X_(k)/θ ~ Beta(k, n − k + 1): f(x) = [Γ(n + 1)/(Γ(k)Γ(n + 1 − k))] x^{k−1}(1 − x)^{n−k} for 0 ≤ x ≤ 1; E[X] = k/(n + 1).
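
As a hedged simulation sketch (illustrative θ, n, k, assuming NumPy/SciPy), the rescaled order statistic X_(k)/θ can be compared with the Beta(k, n − k + 1) distribution:

```python
# The k-th order statistic of Uniform[0, theta] samples, rescaled, versus Beta(k, n-k+1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
theta, n, k = 2.0, 7, 3
u = rng.uniform(0.0, theta, size=(200_000, n))
x_k = np.sort(u, axis=1)[:, k - 1] / theta   # k-th smallest value, rescaled to [0, 1]

print(x_k.mean())                            # close to k/(n+1) = 0.375
print(stats.beta.mean(k, n - k + 1))         # mean of Beta(k, n-k+1)
```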

Summary of discrete random variables.

1. X is the number of Bernoulli trials until the first success occurs (geometric).
2. X is the number of Bernoulli trials until the r-th success occurs (negative binomial). If X_1, ..., X_r are independent geometric random variables with parameter p, then X = X_1 + ⋯ + X_r becomes a negative binomial random variable with parameter (r, p).
3. The geometric distribution is the negative binomial distribution with r = 1.
4. If X_1, ..., X_n are independent Bernoulli trials, then X = X_1 + ⋯ + X_n becomes a binomial random variable with parameter (n, p).
5. If X_1 and X_2 are independent binomial random variables with parameters (n_1, p) and (n_2, p), then so is X = X_1 + X_2, with parameter (n_1 + n_2, p).
6. The binomial distribution converges to the Poisson distribution by letting n → ∞ and p → 0 while λ = np is held fixed.
7. If X_1 and X_2 are independent Poisson random variables with parameters λ_1 and λ_2, then so is X = X_1 + X_2, with parameter λ_1 + λ_2.

Summary of discrete random variables, continued.

Bernoulli trial: P(X = 1) = p and P(X = 0) = q := 1 − p; E[X] = p, Var(X) = pq. Here p is the probability of success and q := 1 − p is the probability of failure.
Geometric(p): P(X = i) = q^{i−1} p, i = 1, 2, ...; E[X] = 1/p, Var(X) = q/p².
Binomial(n, p): P(X = i) = C(n, i) p^i q^{n−i}, i = 0, 1, ..., n; E[X] = np, Var(X) = npq.

Summary of continuous random variables.

1. If X_1 and X_2 are independent normal random variables with parameters (µ_1, σ_1²) and (µ_2, σ_2²), then so is X = X_1 + X_2, with parameter (µ_1 + µ_2, σ_1² + σ_2²).
2. If X_1, ..., X_m are iid normal random variables with parameter (µ, σ²), then X = ((X_1 − µ)/σ)² + ⋯ + ((X_m − µ)/σ)² is a chi-square random variable with m degrees of freedom.
3. The chi-square distribution with m degrees of freedom is the gamma distribution with n = m/2 and λ = 1/2.
4. If X_1, ..., X_n are independent exponential random variables with parameters λ_1, ..., λ_n, then so is X = min{X_1, ..., X_n}, with parameter λ_1 + ⋯ + λ_n.
5. If X_1, ..., X_n are iid exponential random variables with parameter λ, then X = X_1 + ⋯ + X_n is a gamma random variable with parameter (n, λ).
6. The exponential distribution with parameter λ is the gamma distribution with n = 1.
7. If X_1 and X_2 are independent gamma random variables with parameters (n_1, λ) and (n_2, λ), then so is X = X_1 + X_2, with parameter (n_1 + n_2, λ).

Summary of continuous random variables, continued.

Normal(µ, σ²): f(x) = (1/(√(2π) σ)) e^{−(x−µ)²/(2σ²)}, −∞ < x < ∞; E[X] = µ, Var(X) = σ².
Exponential(λ): f(x) = λ e^{−λx}, x ≥ 0; E[X] = 1/λ, Var(X) = 1/λ².
Gamma(α, λ): f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx}, x ≥ 0; E[X] = α/λ, Var(X) = α/λ².

Exercises

Problem. Suppose that X has the pdf f(x) = (x + 2)/18, −2 ≤ x ≤ 4. Calculate E[X], E[(X + 2)³], and E[6X − 2(X + 2)³].

Problem. Suppose that X has the pdf f(x) = (x + 2)/18, −2 ≤ x ≤ 4. Calculate E[X], E[(X + 2)³], and E[6X − 2(X + 2)³].

Solution.
$$E[X + 2] = \int_{-2}^{4} \frac{(x+2)^2}{18}\, dx = \left[\frac{(x+2)^3}{54}\right]_{-2}^{4} = 4; \quad\text{thus } E[X] = 2.$$
$$E[(X + 2)^3] = \int_{-2}^{4} \frac{(x+2)^4}{18}\, dx = \left[\frac{(x+2)^5}{90}\right]_{-2}^{4} = 432/5.$$
$$E[6X - 2(X + 2)^3] = 6E[X] - 2E[(X + 2)^3] = -804/5.$$

Problem. Let f(x) = 6x(1 − x), 0 ≤ x ≤ 1, be a pdf for X. Find the mean and the variance.

Problem. Let f(x) = 6x(1 − x), 0 ≤ x ≤ 1, be a pdf for X. Find the mean and the variance.

Solution.
$$E[X] = \int_0^1 (6x^2 - 6x^3)\, dx = \left[2x^3 - \frac{3x^4}{2}\right]_0^1 = 1/2.$$
$$E[X^2] = \int_0^1 (6x^3 - 6x^4)\, dx = \left[\frac{3x^4}{2} - \frac{6x^5}{5}\right]_0^1 = 3/10.$$
Therefore, Var(X) = 3/10 − (1/2)² = 1/20.

Problem.
1. If X ~ N(75, 100), calculate P(X ≤ 60) and P(70 ≤ X ≤ 100).
2. If X ~ N(µ, σ²), find b so that P(−b < (X − µ)/σ < b) =

Problem.
1. If X ~ N(75, 100), calculate P(X ≤ 60) and P(70 ≤ X ≤ 100).
2. If X ~ N(µ, σ²), find b so that P(−b < (X − µ)/σ < b) =

Solution.
1. Since X ~ N(75, 100) has σ = 10,
$$P(X \le 60) = \Phi\!\left(\frac{60 - 75}{10}\right) = \Phi(-1.5) \approx 0.0668,$$
$$P(70 \le X \le 100) = \Phi\!\left(\frac{100 - 75}{10}\right) - \Phi\!\left(\frac{70 - 75}{10}\right) = \Phi(2.5) - \Phi(-0.5) \approx 0.6853.$$
2. Since P(−b < (X − µ)/σ < b) = 2Φ(b) − 1, the value of b is read off from the standard normal table for the required probability.

Problem. Suppose that X and Y have the joint density f(x, y) = 2e^{−x−y}, 0 < x < y < ∞. Find the joint density of U = 2X and V = Y − X.

Problem. Suppose that X and Y have the joint density f(x, y) = 2e^{−x−y}, 0 < x < y < ∞. Find the joint density of U = 2X and V = Y − X.

Solution. The transformation g_1(x, y) = 2x and g_2(x, y) = y − x maps {(x, y) : 0 < x < y < ∞} onto {(u, v) : 0 < u < ∞, 0 < v < ∞}. We obtain the inverse transformation x = h_1(u, v) = u/2 and y = h_2(u, v) = u/2 + v, and calculate the Jacobian
$$J_h(u, v) = \det\begin{bmatrix} 1/2 & 0 \\ 1/2 & 1 \end{bmatrix} = 1/2.$$
Thus, the joint density is given by
$$f_{UV}(u, v) = f_{XY}(u/2,\, u/2 + v) \cdot \tfrac{1}{2} = e^{-u-v}$$
if 0 < u < ∞ and 0 < v < ∞.

Problem. Suppose that X and Y have the joint density f(x, y) = 8xy, 0 < x < y < 1. Find the joint density of U = X/Y and V = Y.

Problem. Suppose that X and Y have the joint density f(x, y) = 8xy, 0 < x < y < 1. Find the joint density of U = X/Y and V = Y.

Solution. The transformation g_1(x, y) = x/y and g_2(x, y) = y maps {(x, y) : 0 < x < y < 1} onto {(u, v) : 0 < u < 1, 0 < v < 1}. We obtain the inverse transformation x = h_1(u, v) = uv and y = h_2(u, v) = v, and calculate the Jacobian
$$J_h(u, v) = \det\begin{bmatrix} v & u \\ 0 & 1 \end{bmatrix} = v.$$
Thus, the joint density is given by
$$f_{UV}(u, v) = f_{XY}(uv, v) \cdot v = 8uv^3$$
if 0 < u < 1 and 0 < v < 1.

Problem. Let X_(k), k = 1, 2, 3, 4, be the order statistics of a random sample of size 4 from the pdf f(x) = e^{−x}, 0 < x < ∞. Find P(X_(4) > 3).

Problem. Let X_(k), k = 1, 2, 3, 4, be the order statistics of a random sample of size 4 from the pdf f(x) = e^{−x}, 0 < x < ∞. Find P(X_(4) > 3).

Solution. Since X_(4) = max(X_1, ..., X_4),
$$P(X_{(4)} \le 3) = \prod_{i=1}^{4} P(X_i \le 3) = (1 - e^{-3})^4.$$
Thus, we obtain
$$P(X_{(4)} > 3) = 1 - (1 - e^{-3})^4.$$
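
A quick Monte Carlo check of this answer (illustrative, assuming NumPy):

```python
# P(X_(4) > 3) for the maximum of four iid standard exponential samples.
import numpy as np

rng = np.random.default_rng(6)
x_max = rng.exponential(size=(500_000, 4)).max(axis=1)
print(np.mean(x_max > 3.0))             # empirical estimate
print(1.0 - (1.0 - np.exp(-3.0))**4)    # exact value, about 0.185
```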
