Random Variables and Their Distributions
A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital letters: X, Y or Z, and their values by small letters: x, y or z, respectively. There are two types of r.vs: discrete and continuous. Random variables that can take a countable number of values are called discrete. Random variables that take values from an interval of real numbers are called continuous.

Example 3.1 Discrete r.v., Brockwell and Davis (2002)
Yearly results of the all-star baseball games over a number of years, where

    X_t = 1 if the National League won in year t,
    X_t = 0 if the American League won in year t.

In each of the realizations there is some probability p of X_t = 1 and probability 1 − p of X_t = 0.

Example 3.2 Continuous r.v.
The r.vs in all the examples given in Chapter 1 are continuous.
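A sequence like the one in Example 3.1 can be simulated in a few lines of Python (a sketch; the function name and the seed are mine, not from the text):

```python
import random

def simulate_wins(p, n, seed=0):
    """Simulate n yearly outcomes X_t: 1 if the National League wins
    (probability p), 0 if the American League wins (probability 1 - p)."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

xs = simulate_wins(p=0.5, n=1000)
print(sum(xs) / len(xs))  # sample frequency, close to p for large n
```

For large n the sample frequency of ones settles near p, which is the empirical face of the probability p mentioned in the example.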
3.1 One Dimensional Random Variables

Definition 3.1. If E is an experiment having sample space S, and X is a function that assigns a real number X(e) to every outcome e ∈ S, then X is called a random variable.

Definition 3.2. Let X denote a r.v. and x its particular value from the whole range of all values of X, say R_X. The probability of the event (X ≤ x) expressed as a function of x,

    F_X(x) = P_X(X ≤ x),                                   (3.1)

is called the cumulative distribution function (c.d.f.) of the r.v. X.

Properties of cumulative distribution functions
- 0 ≤ F_X(x) ≤ 1 for −∞ < x < ∞
- lim_{x→∞} F_X(x) = 1 and lim_{x→−∞} F_X(x) = 0
- The function is nondecreasing. That is, if x_1 ≤ x_2 then F_X(x_1) ≤ F_X(x_2).

3.1.1 Discrete Random Variables

Values of a discrete r.v. are elements of a countable set {x_1, x_2, ...}. We associate a number p_X(x_i) = P_X(X = x_i) with each value x_i, i = 1, 2, ..., such that:

1. p_X(x_i) ≥ 0 for all i
2. Σ_{i=1}^{∞} p_X(x_i) = 1

Note that

    F_X(x_i) = P_X(X ≤ x_i) = Σ_{x ≤ x_i} p_X(x),          (3.2)

    p_X(x_i) = F_X(x_i) − F_X(x_{i−1}).                    (3.3)

The function p_X is called the probability mass function of the random variable X, and the collection of pairs

    {(x_i, p_X(x_i)), i = 1, 2, ...}                       (3.4)
is called the probability distribution of X. The distribution is usually presented in either tabular, graphical or mathematical form.

Examples of known p.m.fs

Ex. 3.3 The Binomial Distribution (X denotes the number k of successes in n independent trials). The p.m.f. of a binomially distributed r.v. X with parameters n and p is

    P(X = k) = \binom{n}{k} p^k (1 − p)^{n−k},  k = 0, 1, 2, ..., n,

where n is a positive integer and 0 ≤ p ≤ 1.

Ex. 3.4 The Uniform Distribution. The p.m.f. of a r.v. X uniformly distributed on {1, 2, ..., n} is

    P(X = k) = 1/n,  k = 1, 2, ..., n,

where n is a positive integer.

Ex. 3.5 The Poisson Distribution (X denotes a number of outcomes in a period of time). The p.m.f. of a r.v. X having a Poisson distribution with parameter λ > 0 is

    P(X = k) = (λ^k / k!) e^{−λ},  k = 0, 1, 2, ....

Take as an example a r.v. X having the Binomial distribution: X ~ Bin(8, 0.4). That is, n = 8 and the probability of success is p = 0.4. The distribution, shown in mathematical, tabular and graphical form, and a graph of the c.d.f. of the variable X follow.

Mathematical form:

    {(k, P(X = k) = \binom{8}{k} 0.4^k 0.6^{8−k}),  k = 0, 1, 2, ..., 8}   (3.5)

Tabular form: the values of P(X = k) and P(X ≤ k) listed for k = 0, 1, ..., 8.
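The tabular form can be regenerated from the mathematical form (3.5); a minimal Python sketch (the helper name is mine, not from the text):

```python
from math import comb

def binom_pmf(k, n=8, p=0.4):
    """P(X = k) for X ~ Bin(n, p), as in Ex. 3.3."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Tabular form: k, P(X = k), and the c.d.f. value P(X <= k).
cdf = 0.0
for k in range(9):
    cdf += binom_pmf(k)
    print(f"{k}  {binom_pmf(k):.4f}  {cdf:.4f}")
```

The last c.d.f. entry equals 1, reflecting property 2 of the p.m.f.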
[Figure 3.1: Graphical representation of the mass function and the cumulative distribution function for X ~ Bin(8, 0.4)]

Other important discrete distributions are:
- Bernoulli(p)
- Geometric(p)
- Hypergeometric(n, M, N)

3.1.2 Continuous Random Variables

Values of a continuous r.v. are elements of an uncountable set, for example a real interval. A c.d.f. of a continuous r.v. is a continuous, nondecreasing, differentiable function. An interesting difference from a discrete r.v. is that, for δ > 0,

    P_X(X = x) = lim_{δ→0} (F_X(x + δ) − F_X(x)) = 0.

We define the probability density function (p.d.f.) of a continuous r.v. as

    f_X(x) = d/dx F_X(x).                                  (3.6)

Hence

    F_X(x) = ∫_{−∞}^{x} f_X(t) dt.                         (3.7)

Similarly to the properties of the probability distribution of a discrete r.v., we have the following properties of the density function:

1. f_X(x) ≥ 0 for all x ∈ R_X
2. ∫_{R_X} f_X(x) dx = 1
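Relation (3.6) can be checked numerically: for the standard normal, a central difference of the c.d.f. should reproduce the density. A sketch (Phi and phi are my own helper names):

```python
from math import erf, exp, pi, sqrt

def Phi(x):
    """Standard normal c.d.f., written via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def phi(x):
    """Standard normal p.d.f."""
    return exp(-x**2 / 2) / sqrt(2 * pi)

# (3.6): the density is the derivative of the c.d.f., so a central
# difference of Phi should match phi closely.
h = 1e-5
for x in (-1.0, 0.0, 2.0):
    deriv = (Phi(x + h) - Phi(x - h)) / (2 * h)
    print(x, deriv, phi(x))
```

The two printed columns agree to several decimal places at each x.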
[Figure 3.2: Density function and cumulative distribution function for the N(4.5, 2) distribution]

The probability of an event (X ∈ A), where A is an interval (−∞, a), is expressed as an integral

    P_X(−∞ < X < a) = ∫_{−∞}^{a} f_X(x) dx = F_X(a),       (3.8)

or, for a bounded interval (b, c),

    P_X(b < X < c) = ∫_{b}^{c} f_X(x) dx = F_X(c) − F_X(b).   (3.9)

Examples of continuous r.vs

Ex. 3.6 The normal distribution N(µ, σ²). The density function is given by

    f_X(x) = (1 / (σ√(2π))) e^{−(x−µ)²/(2σ²)}.             (3.10)

There are two parameters which tell us about the position and the shape of the density curve: the expected value µ and the standard deviation σ.

Ex. 3.7 The uniform distribution U(a, b). The p.d.f. of a r.v. X uniformly distributed on [a, b] is given by

    f_X(x) = 1/(b − a) if a ≤ x ≤ b, and 0 otherwise.
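Formula (3.9) can be tried out on the N(4.5, 2) distribution of Figure 3.2. The sketch below reads the second parameter as the variance, which is an assumption on my part; the helper name is mine:

```python
from math import erf, sqrt

def normal_cdf(x, mu=4.5, sigma=sqrt(2)):
    """c.d.f. of N(mu, sigma^2), written with the error function.
    Here sigma^2 = 2, reading N(4.5, 2) as mean 4.5, variance 2
    (an assumption about the notation)."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# (3.9): P(b < X < c) = F(c) - F(b) for a bounded interval (b, c).
b, c = 3.0, 6.0
print(normal_cdf(c) - normal_cdf(b))
```

By symmetry, F(4.5) = 0.5, so intervals centered at the mean have easily checked probabilities.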
[Figure 3.3: Density function and cumulative distribution function for Exp(1)]

Ex. 3.8 The exponential distribution Exp(λ). The p.d.f. of an exponentially distributed r.v. X with parameter λ > 0 is

    f_X(x) = 0 if x < 0,  and  f_X(x) = λ e^{−λx} if x ≥ 0.

The corresponding c.d.f. is

    F_X(x) = 0 if x < 0,  and  F_X(x) = 1 − e^{−λx} if x ≥ 0.

Ex. 3.9 The gamma distribution Gamma(α, λ), with parameters representing shape (α) and scale (λ). The p.d.f. of a gamma distributed r.v. X is

    f_X(x) = 0 if x < 0,  and  f_X(x) = x^{α−1} λ^{α} e^{−λx} / Γ(α) if x ≥ 0,

where α > 0, λ > 0, and Γ is the gamma function defined by

    Γ(n) = ∫_{0}^{∞} x^{n−1} e^{−x} dx,  for n > 0.

A recursive relationship, easily shown by integrating the above equation by parts, is

    Γ(n) = (n − 1) Γ(n − 1).

If n is a positive integer, then

    Γ(n) = (n − 1)!,

since Γ(1) = 1.
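The factorial identity and the recursion for Γ can be spot-checked with the standard library (a sketch, not from the text):

```python
from math import gamma, factorial

# Gamma(n) = (n - 1)! for positive integers n, since Gamma(1) = 1.
for n in range(1, 7):
    print(n, gamma(n), factorial(n - 1))

# The recursion Gamma(n) = (n - 1) * Gamma(n - 1) also holds at
# non-integer arguments; checked here at n = 4.5.
print(gamma(4.5), 3.5 * gamma(3.5))
```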
Other important continuous distributions are:
- The chi-squared distribution with ν degrees of freedom, χ²_ν
- The F distribution with ν_1, ν_2 degrees of freedom, F_{ν_1,ν_2}
- The Student t-distribution with ν degrees of freedom, t_ν

These distributions are functions of normally distributed r.vs.

Note that all the distributions depend on some parameters, like p, λ, µ, σ or others. These values are usually unknown, and so their estimation is one of the important problems in statistical analyses.

3.1.3 Expectation

The expectation of a function g of a r.v. X is defined by

    E(g(X)) = ∫ g(x) f(x) dx       for a continuous r.v.,
    E(g(X)) = Σ_j g(x_j) p(x_j)    for a discrete r.v.,    (3.11)

where g is any function such that E|g(X)| < ∞. Two important special cases of g are:
- Mean E(X), also denoted by E X, when g(X) = X,
- Variance E(X − E X)², when g(X) = (X − E X)².

The following relation is very useful when calculating the variance:

    E(X − E X)² = E X² − (E X)².                           (3.12)

Example 3.10 Let X be a r.v. such that

    f(x) = (1/2) sin x for x ∈ [0, π], and 0 otherwise.

Then the expectation and variance are the following.
Expectation:

    E X = (1/2) ∫_{0}^{π} x sin x dx = π/2.

Variance:

    var(X) = E X² − (E X)² = (1/2) ∫_{0}^{π} x² sin x dx − (π/2)² = π²/4 − 2.

Example 3.11 The Normal distribution, X ~ N(µ, σ²). We will show that the parameter µ is the expectation of X and the parameter σ² is the variance of X. First we show that E(X − µ) = 0, and hence E X = µ:

    E(X − µ) = ∫ (x − µ) f(x) dx = ∫ (x − µ) (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)} dx = −σ² ∫ f′(x) dx = 0.

Similar arguments give

    E(X − µ)² = ∫ (x − µ)² f(x) dx = −σ² ∫ (x − µ) f′(x) dx = σ².

A useful linearity property of expectation is

    E[a g(X) + b h(Y) + c] = a E[g(X)] + b E[h(Y)] + c,    (3.13)

for any real constants a, b and c, and for any functions g and h of r.vs X and Y whose expectations exist.
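The moments in Example 3.10 can be verified by numerical integration; a sketch using a simple trapezoid rule (helper names are mine):

```python
from math import sin, pi

def trapezoid(g, a, b, n=100_000):
    """Composite trapezoid rule for the integral of g over [a, b]."""
    h = (b - a) / n
    return h * (sum(g(a + i * h) for i in range(1, n)) + (g(a) + g(b)) / 2)

# Example 3.10: f(x) = (1/2) sin x on [0, pi].
f = lambda x: 0.5 * sin(x)
mean = trapezoid(lambda x: x * f(x), 0, pi)
second = trapezoid(lambda x: x * x * f(x), 0, pi)
print(mean, pi / 2)                     # E X = pi/2 ~ 1.5708
print(second - mean**2, pi**2 / 4 - 2)  # var(X) = pi^2/4 - 2 ~ 0.4674
```

The variance is computed via (3.12), E X² − (E X)², just as in the worked example.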
3.2 Two Dimensional Random Variables

Definition 3.3. Let S be a sample space associated with an experiment E, and let X_1, X_2 be functions, each assigning a real number X_1(e), X_2(e) to every outcome e ∈ S. Then the pair X = (X_1, X_2) is called a two-dimensional random variable. The range space of the two-dimensional random variable is

    R_X = {(x_1, x_2) : x_1 ∈ R_{X_1}, x_2 ∈ R_{X_2}} ⊆ R².

Definition 3.4. The cumulative distribution function of a two-dimensional r.v. X = (X_1, X_2) is

    F_X(x_1, x_2) = P(X_1 ≤ x_1, X_2 ≤ x_2).               (3.14)

3.2.1 Discrete Two-Dimensional Random Variables

If all values of X = (X_1, X_2) are countable, i.e., the values are in the range

    R_X = {(x_{1i}, x_{2j}), i = 1, ..., n, j = 1, ..., m},

then the variable is discrete. The c.d.f. of a discrete r.v. X = (X_1, X_2) is defined as

    F_X(x_1, x_2) = Σ_{x_{1i} ≤ x_1} Σ_{x_{2j} ≤ x_2} p(x_{1i}, x_{2j}),   (3.15)

where p(x_{1i}, x_{2j}) denotes the joint probability mass function and

    p(x_{1i}, x_{2j}) = P(X_1 = x_{1i}, X_2 = x_{2j}).

As in the univariate case, the joint p.m.f. satisfies the following conditions:

1. p(x_{1i}, x_{2j}) ≥ 0 for all i, j
2. Σ_{all j} Σ_{all i} p(x_{1i}, x_{2j}) = 1

Example 3.12 Tossing two fair dice. Consider an experiment of tossing two fair dice and noting the outcome on each die. The whole sample space consists of 36 elements, i.e.,

    S = {(i, j) : i, j = 1, ..., 6}.
Now, with each of these 36 elements associate the values of two variables, X_1 and X_2, such that

    X_1 = sum of the outcomes on the two dice,
    X_2 = difference of the outcomes on the two dice.

Then the bivariate r.v. X = (X_1, X_2) has a joint probability mass function, which can be set out as a table of the values of X_1 against the values of X_2, each cell holding the probability of the corresponding pair.

Marginal p.m.fs

Theorem 3.1. Let X = (X_1, X_2) be a discrete bivariate random variable with joint p.m.f. p(x_1, x_2). Then the marginal p.m.fs of X_1 and X_2, p_{X_1} and p_{X_2}, are given respectively by

    p_{X_1}(x_1) = P(X_1 = x_1) = Σ_{R_{X_2}} p(x_1, x_2)

and

    p_{X_2}(x_2) = P(X_2 = x_2) = Σ_{R_{X_1}} p(x_1, x_2).

Example 3.12 cont. The marginal distributions of the variables X_1 and X_2 follow by summing the joint table over the other variable: a table of x_1 against P(X_1 = x_1), and a table of x_2 against P(X_2 = x_2).
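The joint and marginal p.m.fs of Example 3.12 can be tabulated exactly with rational arithmetic. The sketch below reads "difference" as the absolute difference, which is an assumption on my part; with a signed difference only the abs() call changes:

```python
from fractions import Fraction
from collections import defaultdict

# Joint p.m.f. of X1 = sum and X2 = difference for two fair dice
# (Example 3.12); each of the 36 outcomes has probability 1/36.
joint = defaultdict(Fraction)
for i in range(1, 7):
    for j in range(1, 7):
        joint[(i + j, abs(i - j))] += Fraction(1, 36)

# Marginals via Theorem 3.1: sum the joint p.m.f. over the other
# coordinate.
p1 = defaultdict(Fraction)
p2 = defaultdict(Fraction)
for (x1, x2), p in joint.items():
    p1[x1] += p
    p2[x2] += p

print(p1[7])  # P(X1 = 7) = 6/36
print(p2[0])  # P(X2 = 0) = 6/36, the six doubles
```

Using Fraction keeps every table entry an exact multiple of 1/36, so the two marginals sum to 1 with no rounding.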
Expectations of functions of bivariate random variables are calculated in the same way as for univariate r.vs. Let g(x_1, x_2) be a real valued function defined on R_X. Then g(X) = g(X_1, X_2) is a r.v. and its expectation is

    E[g(X)] = Σ_{R_X} g(x_1, x_2) p(x_1, x_2).

Example 3.12 cont. For g(X_1, X_2) = X_1 X_2 we obtain E[g(X)] as the sum of x_1 x_2 p(x_1, x_2) over all pairs in the joint table.

3.2.2 Continuous Two-Dimensional Random Variables

If the values of X = (X_1, X_2) are elements of an uncountable set in the Euclidean plane, then the variable is continuous. For example, the values might be in the range

    R_X = {(x_1, x_2) : a ≤ x_1 ≤ b, c ≤ x_2 ≤ d}

for some real a, b, c, d. The c.d.f. of a continuous r.v. X = (X_1, X_2) is defined as

    F_X(x_1, x_2) = ∫_{−∞}^{x_2} ∫_{−∞}^{x_1} f(t_1, t_2) dt_1 dt_2,   (3.16)

where f(x_1, x_2) is the probability density function such that

1. f(x_1, x_2) ≥ 0 for all (x_1, x_2) ∈ R²
2. ∫∫ f(x_1, x_2) dx_1 dx_2 = 1.

Equation (3.16) implies that

    ∂²F(x_1, x_2) / (∂x_1 ∂x_2) = f(x_1, x_2).             (3.17)

Also,

    P(a ≤ X_1 ≤ b, c ≤ X_2 ≤ d) = ∫_{c}^{d} ∫_{a}^{b} f(x_1, x_2) dx_1 dx_2.   (3.18)
[Figure 3.4: Domain of the p.d.f. (3.21)]

The marginal p.d.fs of X_1 and X_2 are defined similarly as in the discrete case, here using integrals:

    f_{X_1}(x_1) = ∫_{−∞}^{∞} f(x_1, x_2) dx_2,  for −∞ < x_1 < ∞,   (3.19)

    f_{X_2}(x_2) = ∫_{−∞}^{∞} f(x_1, x_2) dx_1,  for −∞ < x_2 < ∞.   (3.20)

Example 3.13 Calculate P[X ∈ A], where A = {(x_1, x_2) : x_1 + x_2 ≥ 1} and the joint p.d.f. of X = (X_1, X_2) is defined by

    f_X(x_1, x_2) = 6 x_1 x_2²  for 0 < x_1 < 1, 0 < x_2 < 1, and 0 otherwise.   (3.21)

The probability is a double integral of the p.d.f. over the region A. The region is, however, limited by the domain in which the p.d.f. is positive; see Figure 3.4.
We can write

    A = {(x_1, x_2) : x_1 + x_2 ≥ 1, 0 < x_1 < 1, 0 < x_2 < 1}
      = {(x_1, x_2) : x_1 ≥ 1 − x_2, 0 < x_1 < 1, 0 < x_2 < 1}
      = {(x_1, x_2) : 1 − x_2 ≤ x_1 < 1, 0 < x_2 < 1}.

Hence, the probability is

    P(X ∈ A) = ∫∫_A f(x_1, x_2) dx_1 dx_2 = ∫_{0}^{1} ∫_{1−x_2}^{1} 6 x_1 x_2² dx_1 dx_2 = 0.9.

Also, using formulae (3.19) and (3.20) we can calculate the marginal p.d.fs:

    f_{X_1}(x_1) = ∫_{0}^{1} 6 x_1 x_2² dx_2 = 2 x_1 x_2³ |_{x_2=0}^{1} = 2 x_1,

    f_{X_2}(x_2) = ∫_{0}^{1} 6 x_1 x_2² dx_1 = 3 x_1² x_2² |_{x_1=0}^{1} = 3 x_2².

These functions allow us to calculate probabilities involving only one variable. For example,

    P(1/4 < X_1 < 1/2) = ∫_{1/4}^{1/2} 2 x_1 dx_1 = 3/16.

Similarly to the case of a univariate r.v., the following linearity property of the expectation holds:

    E[a g(X) + b h(X) + c] = a E[g(X)] + b E[h(X)] + c,    (3.22)

where a, b and c are constants, and g and h are some functions of the bivariate r.v. X = (X_1, X_2).

3.2.3 Conditional Distributions and Independence

Definition 3.5. Let X = (X_1, X_2) denote a discrete bivariate r.v. with joint p.m.f. p_X(x_1, x_2) and marginal p.m.fs p_{X_1}(x_1) and p_{X_2}(x_2). For any x_1 such that p_{X_1}(x_1) > 0, the conditional p.m.f. of X_2 given that X_1 = x_1 is the function of x_2 defined by

    p_{X_2}(x_2 | x_1) = p_X(x_1, x_2) / p_{X_1}(x_1).     (3.23)

Analogously, we define the conditional p.m.f. of X_1 given X_2 = x_2:

    p_{X_1}(x_1 | x_2) = p_X(x_1, x_2) / p_{X_2}(x_2).     (3.24)
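The value P(X ∈ A) = 0.9 in Example 3.13 can be sanity-checked by simulation; a rejection-sampling sketch (all names and the seed are mine):

```python
import random

# Monte Carlo check of Example 3.13: draw from the density
# f(x1, x2) = 6 x1 x2^2 on the unit square by rejection, then
# estimate P(X1 + X2 >= 1), which should come out close to 0.9.
rng = random.Random(42)

def sample():
    while True:
        x1, x2 = rng.random(), rng.random()
        # f / max(f) = x1 * x2^2, since f <= 6 on the unit square;
        # accept a uniform candidate with that probability.
        if rng.random() < x1 * x2**2:
            return x1, x2

n = 50_000
hits = sum(1 for _ in range(n) if sum(sample()) >= 1)
print(hits / n)  # close to 0.9
```

Rejection sampling avoids inverting the joint c.d.f.; candidates land uniformly in the square and survive in proportion to the density.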
It is easy to check that these functions are indeed p.m.fs. For example, for (3.23) we have

    Σ_{R_{X_2}} p_{X_2}(x_2 | x_1) = Σ_{R_{X_2}} p_X(x_1, x_2) / p_{X_1}(x_1) = p_{X_1}(x_1) / p_{X_1}(x_1) = 1.

Example 3.12 cont. The conditional p.m.f. of X_2 given, for example, X_1 = 5, is the table of values p_{X_2}(x_2 | 5) over the possible values x_2.

Analogously to the conditional distribution for discrete r.vs, we define the conditional distribution for continuous r.vs.

Definition 3.6. Let X = (X_1, X_2) denote a continuous bivariate r.v. with joint p.d.f. f_X(x_1, x_2) and marginal p.d.fs f_{X_1}(x_1) and f_{X_2}(x_2). For any x_1 such that f_{X_1}(x_1) > 0, the conditional p.d.f. of X_2 given that X_1 = x_1 is the function of x_2 defined by

    f_{X_2}(x_2 | x_1) = f_X(x_1, x_2) / f_{X_1}(x_1).     (3.25)

Analogously, we define the conditional p.d.f. of X_1 given X_2 = x_2:

    f_{X_1}(x_1 | x_2) = f_X(x_1, x_2) / f_{X_2}(x_2).     (3.26)

Here too, it is easy to verify that these functions are p.d.fs. For example, for the function (3.25) we have

    ∫_{R_{X_2}} f_{X_2}(x_2 | x_1) dx_2 = ∫_{R_{X_2}} f_X(x_1, x_2) dx_2 / f_{X_1}(x_1) = f_{X_1}(x_1) / f_{X_1}(x_1) = 1.
Example 3.13 cont. The conditional p.d.fs are

    f_{X_1}(x_1 | x_2) = f_X(x_1, x_2) / f_{X_2}(x_2) = 6 x_1 x_2² / (3 x_2²) = 2 x_1

and

    f_{X_2}(x_2 | x_1) = f_X(x_1, x_2) / f_{X_1}(x_1) = 6 x_1 x_2² / (2 x_1) = 3 x_2².

The conditional p.d.fs make it possible to calculate conditional expectations. The conditional expected value of a function g(X_2) given that X_1 = x_1 is defined by

    E[g(X_2) | x_1] = Σ_{R_{X_2}} g(x_2) p_{X_2}(x_2 | x_1)        for a discrete r.v.,
    E[g(X_2) | x_1] = ∫_{R_{X_2}} g(x_2) f_{X_2}(x_2 | x_1) dx_2   for a continuous r.v.   (3.27)

Example 3.13 cont. Equation (3.27) allows us to calculate the conditional mean and variance of a r.v. Here we have

    µ_{X_2|x_1} = E(X_2 | x_1) = ∫_{0}^{1} x_2 · 3 x_2² dx_2 = 3/4,

and

    σ²_{X_2|x_1} = var(X_2 | x_1) = E(X_2² | x_1) − [E(X_2 | x_1)]² = ∫_{0}^{1} x_2² · 3 x_2² dx_2 − (3/4)² = 3/5 − 9/16 = 3/80.

Definition 3.7. Let X = (X_1, X_2) denote a continuous bivariate r.v. with joint p.d.f. f_X(x_1, x_2) and marginal p.d.fs f_{X_1}(x_1) and f_{X_2}(x_2). Then X_1 and X_2 are called independent random variables if, for every x_1 ∈ R_{X_1} and x_2 ∈ R_{X_2},

    f_X(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2).             (3.28)
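The conditional mean 3/4 and variance 3/80 can be confirmed with a crude Riemann sum over f(x_2 | x_1) = 3 x_2² on (0, 1) (a sketch; the discretization is mine):

```python
# Right-endpoint Riemann sums for E(X2 | x1) and E(X2^2 | x1)
# under the conditional density 3 * x2**2 on (0, 1); the answer
# does not depend on x1, matching Example 3.13.
n = 200_000
h = 1.0 / n
mean = sum((i * h) * 3 * (i * h) ** 2 * h for i in range(1, n + 1))
second = sum((i * h) ** 2 * 3 * (i * h) ** 2 * h for i in range(1, n + 1))
print(mean, second - mean**2)  # ~0.75 and ~0.0375 (= 3/4 and 3/80)
```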
If X_1 and X_2 are independent, then the conditional p.d.f. of X_2 given X_1 = x_1 is

    f_{X_2}(x_2 | x_1) = f_X(x_1, x_2) / f_{X_1}(x_1) = f_{X_1}(x_1) f_{X_2}(x_2) / f_{X_1}(x_1) = f_{X_2}(x_2),

regardless of the value of x_1. An analogous property holds for the conditional p.d.f. of X_1 given X_2 = x_2.

Example 3.13 cont. It is easy to notice that

    f_X(x_1, x_2) = 6 x_1 x_2² = 2 x_1 · 3 x_2² = f_{X_1}(x_1) f_{X_2}(x_2).

So the variables X_1 and X_2 are independent.

In fact, two r.vs are independent if and only if there exist functions g(x_1) and h(x_2) such that for every x_1 ∈ R_{X_1} and x_2 ∈ R_{X_2},

    f_X(x_1, x_2) = g(x_1) h(x_2).

Theorem 3.2. Let X_1 and X_2 be independent random variables. Then:

1. For any A ⊂ R and B ⊂ R,

    P(X_1 ∈ A, X_2 ∈ B) = P(X_1 ∈ A) P(X_2 ∈ B),

that is, the events {X_1 ∈ A} and {X_2 ∈ B} are independent events.

2. For g(X_1), a function of X_1 only, and for h(X_2), a function of X_2 only, we have

    E[g(X_1) h(X_2)] = E[g(X_1)] E[h(X_2)].

3.2.4 Covariance and Correlation

Covariance and correlation are two measures of the strength of a relationship between two r.vs.
We will use the following notation:

    E X_1 = µ_{X_1},  E X_2 = µ_{X_2},  var(X_1) = σ²_{X_1},  var(X_2) = σ²_{X_2}.

Also, we assume that σ²_{X_1} and σ²_{X_2} are finite positive values. A simplified notation µ_1, µ_2, σ²_1, σ²_2 will be used when it is clear which r.vs the notation refers to.

Definition 3.8. The covariance of X_1 and X_2 is defined by

    cov(X_1, X_2) = E[(X_1 − µ_{X_1})(X_2 − µ_{X_2})].     (3.29)

Definition 3.9. The correlation of X_1 and X_2 is defined by

    ρ_{(X_1,X_2)} = corr(X_1, X_2) = cov(X_1, X_2) / (σ_{X_1} σ_{X_2}).   (3.30)

Properties of covariance

- For any r.vs X_1 and X_2, cov(X_1, X_2) = E(X_1 X_2) − µ_{X_1} µ_{X_2}.
- If the r.vs X_1 and X_2 are independent, then cov(X_1, X_2) = 0. Then also ρ_{(X_1,X_2)} = 0.
- For any r.vs X_1 and X_2 and any constants a and b,

    var(a X_1 + b X_2) = a² var(X_1) + b² var(X_2) + 2ab cov(X_1, X_2).
Properties of correlation

- For any r.vs X_1 and X_2, −1 ≤ ρ_{(X_1,X_2)} ≤ 1.
- ρ_{(X_1,X_2)} = ±1 iff there exist numbers a ≠ 0 and b such that P(X_2 = a X_1 + b) = 1. If ρ_{(X_1,X_2)} = 1 then a > 0, and if ρ_{(X_1,X_2)} = −1 then a < 0.

3.2.5 Bivariate Normal Distribution

Here we use matrix notation. A bivariate r.v. is treated as a random vector X = (X_1, X_2)^T. Let X = (X_1, X_2)^T be a bivariate random vector with expectation

    µ = E X = (E X_1, E X_2)^T = (µ_1, µ_2)^T

and the variance-covariance matrix

    V = ( var(X_1)       cov(X_1, X_2) )   ( σ²_1      ρσ_1σ_2 )
        ( cov(X_2, X_1)  var(X_2)     ) = ( ρσ_1σ_2   σ²_2    ).

Then the joint p.d.f. of the vector r.v. X is given by

    f_X(x) = 1/(2π √(det V)) exp{ −(1/2) (x − µ)^T V⁻¹ (x − µ) },   (3.31)

where x = (x_1, x_2)^T. The determinant of V is

    det V = σ²_1 σ²_2 − ρ²σ²_1σ²_2 = (1 − ρ²) σ²_1 σ²_2.
Hence, the inverse of V is

    V⁻¹ = (1/det V) (  σ²_2      −ρσ_1σ_2 )
                    ( −ρσ_1σ_2    σ²_1    )

        = (1/(1 − ρ²)) (  1/σ²_1        −ρ/(σ_1σ_2) )
                       ( −ρ/(σ_1σ_2)    1/σ²_2      ).

Then the exponent in formula (3.31) can be written as

    −(1/2)(x − µ)^T V⁻¹ (x − µ)
      = −1/(2(1 − ρ²)) [ (x_1 − µ_1)²/σ²_1 − 2ρ (x_1 − µ_1)(x_2 − µ_2)/(σ_1σ_2) + (x_2 − µ_2)²/σ²_2 ].

So the joint p.d.f. of the 2-dimensional vector r.v. X is

    f_X(x) = 1/(2πσ_1σ_2 √(1 − ρ²))
             × exp{ −1/(2(1 − ρ²)) [ (x_1 − µ_1)²/σ²_1 − 2ρ (x_1 − µ_1)(x_2 − µ_2)/(σ_1σ_2) + (x_2 − µ_2)²/σ²_2 ] }.   (3.32)

Remark 3.1. Note that when ρ = 0, the joint p.d.f. (3.32) simplifies to

    f_X(x) = 1/(2πσ_1σ_2) exp{ −(1/2) [ (x_1 − µ_1)²/σ²_1 + (x_2 − µ_2)²/σ²_2 ] },

which can be written as a product of the marginal distributions of X_1 and X_2. Hence, if X = (X_1, X_2)^T has a bivariate normal distribution and ρ = 0, then the variables X_1 and X_2 are independent.

3.3 Multivariate Normal Distribution

Denote by X = (X_1, ..., X_n)^T an n-dimensional random vector each of whose components is a random variable. Then all the definitions given for bivariate r.vs extend to the multivariate r.vs. X has a multivariate normal distribution if its p.d.f. can be written as in (3.31), where the mean is µ = (µ_1, ..., µ_n)^T,
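Remark 3.1 is easy to check numerically: at ρ = 0 the density (3.32) factorizes into its normal marginals. A sketch with illustrative parameter values (all names and numbers are mine):

```python
from math import exp, pi, sqrt

def bivariate_normal_pdf(x1, x2, m1, m2, s1, s2, rho):
    """The bivariate normal density (3.32)."""
    z = ((x1 - m1) ** 2 / s1**2
         - 2 * rho * (x1 - m1) * (x2 - m2) / (s1 * s2)
         + (x2 - m2) ** 2 / s2**2)
    return exp(-z / (2 * (1 - rho**2))) / (2 * pi * s1 * s2 * sqrt(1 - rho**2))

def normal_pdf(x, m, s):
    """Univariate N(m, s^2) density, as in (3.10)."""
    return exp(-((x - m) ** 2) / (2 * s**2)) / (s * sqrt(2 * pi))

# With rho = 0 the joint density equals the product of the marginals.
lhs = bivariate_normal_pdf(1.0, -0.5, 0.0, 1.0, 1.5, 2.0, rho=0.0)
rhs = normal_pdf(1.0, 0.0, 1.5) * normal_pdf(-0.5, 1.0, 2.0)
print(lhs, rhs)  # equal when rho = 0
```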
[Figure 3.5: Bivariate normal p.d.f.]
and the variance-covariance matrix has the form

    V = ( var(X_1)       cov(X_1, X_2)   ...   cov(X_1, X_n) )
        ( cov(X_2, X_1)  var(X_2)        ...   cov(X_2, X_n) )
        ( ...            ...             ...   ...           )
        ( cov(X_n, X_1)  cov(X_n, X_2)   ...   var(X_n)      ).

Remark 3.2. If X ~ N_n(µ, V), B is an m × n matrix, and a is a real m-vector, then the random vector

    Y = a + BX

is also multivariate normal, with

    E(Y) = a + B E(X) = a + Bµ

and the variance-covariance matrix

    V_Y = B V B^T.

Remark 3.3. Taking B = b^T, where b is an n-dimensional vector, and a = 0, we obtain

    Y = b^T X = b_1 X_1 + ... + b_n X_n,

and Y ~ N(b^T µ, b^T V b).

Two important properties of multivariate normal random vectors are given in the following proposition.

Proposition 3.1. Suppose that the normal r.v. is partitioned into two subvectors of dimensions n_1 and n_2,

    X = ( X^(1) )
        ( X^(2) ),
and, correspondingly, the mean vector and the variance-covariance matrix are partitioned as

    µ = ( µ^(1) )       V = ( V_11  V_12 )
        ( µ^(2) ),          ( V_21  V_22 ),

where µ^(i) = E(X^(i)) and V_ij = cov(X^(i), X^(j)). Then:

1. X^(1) and X^(2) are independent iff the (n_1 × n_2)-dimensional covariance matrix V_12 is a zero matrix.

2. The conditional distribution of X^(1) given X^(2) = x^(2) is

    N( µ^(1) + V_12 V_22⁻¹ (x^(2) − µ^(2)),  V_11 − V_12 V_22⁻¹ V_21 ).

For the bivariate normal r.v. we obtain

    E(X_1 | X_2 = x_2) = µ_1 + ρ (σ_1/σ_2)(x_2 − µ_2)       (3.33)

and

    var(X_1 | X_2 = x_2) = σ²_1 (1 − ρ²).                   (3.34)

Definition 3.10. X_t is a Gaussian time series if all its joint distributions are multivariate normal, i.e., if for any collection of integers i_1, ..., i_n, the random vector (X_{i_1}, ..., X_{i_n})^T has a multivariate normal distribution.

For proofs of the results given in this chapter see Casella and Berger (1990) and Brockwell and Davis (2002), or other textbooks on probability and statistics.
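Formulas (3.33) and (3.34) can be checked against the general partition formulas of Proposition 3.1 in the bivariate case; a sketch with arbitrary illustrative numbers (all mine, not from the text):

```python
# In the bivariate case the blocks of V are scalars:
# V11 = s1^2, V22 = s2^2, V12 = V21 = rho * s1 * s2.
m1, m2 = 1.0, -2.0
s1, s2, rho = 1.5, 0.5, 0.6
x2 = 0.3

# General form: mu1 + V12 V22^{-1} (x2 - mu2) and V11 - V12 V22^{-1} V21.
V11, V22 = s1**2, s2**2
V12 = V21 = rho * s1 * s2
cond_mean_general = m1 + V12 / V22 * (x2 - m2)
cond_var_general = V11 - V12 / V22 * V21

# Bivariate shortcuts (3.33) and (3.34).
cond_mean = m1 + rho * s1 / s2 * (x2 - m2)
cond_var = s1**2 * (1 - rho**2)

print(cond_mean_general, cond_mean)
print(cond_var_general, cond_var)
```

Both pairs agree, since V_12 V_22⁻¹ = ρσ_1σ_2/σ²_2 = ρσ_1/σ_2 and V_12 V_22⁻¹ V_21 = ρ²σ²_1.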
More informationReview 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015
Review : STAT 36 Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics August 25, 25 Support of a Random Variable The support of a random variable, which is usually denoted
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationp. 6-1 Continuous Random Variables p. 6-2
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables
More informationProbability- the good parts version. I. Random variables and their distributions; continuous random variables.
Probability- the good arts version I. Random variables and their distributions; continuous random variables. A random variable (r.v) X is continuous if its distribution is given by a robability density
More information2 (Statistics) Random variables
2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationContinuous random variables
Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot
More informationLecture 2: Review of Probability
Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More information15 Discrete Distributions
Lecture Note 6 Special Distributions (Discrete and Continuous) MIT 4.30 Spring 006 Herman Bennett 5 Discrete Distributions We have already seen the binomial distribution and the uniform distribution. 5.
More informationIntroduction to Probability. Ariel Yadin. Lecture 10. Proposition Let X be a discrete random variable, with range R and density f X.
Introduction to Probability Ariel Yadin Lecture 1 1. Expectation - Discrete Case Proposition 1.1. Let X be a discrete random variable, with range R and density f X. Then, *** Jan. 3 *** Expectation for
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationMathematical Statistics 1 Math A 6330
Mathematical Statistics 1 Math A 6330 Chapter 2 Transformations and Expectations Mohamed I. Riffi Department of Mathematics Islamic University of Gaza September 14, 2015 Outline 1 Distributions of Functions
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationREVIEW OF MAIN CONCEPTS AND FORMULAS A B = Ā B. Pr(A B C) = Pr(A) Pr(A B C) =Pr(A) Pr(B A) Pr(C A B)
REVIEW OF MAIN CONCEPTS AND FORMULAS Boolean algebra of events (subsets of a sample space) DeMorgan s formula: A B = Ā B A B = Ā B The notion of conditional probability, and of mutual independence of two
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationMath Review Sheet, Fall 2008
1 Descriptive Statistics Math 3070-5 Review Sheet, Fall 2008 First we need to know about the relationship among Population Samples Objects The distribution of the population can be given in one of the
More informationMoments. Raw moment: February 25, 2014 Normalized / Standardized moment:
Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230
More informationChapter 2. Continuous random variables
Chapter 2 Continuous random variables Outline Review of probability: events and probability Random variable Probability and Cumulative distribution function Review of discrete random variable Introduction
More informationCovariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance
Covariance Lecture 0: Covariance / Correlation & General Bivariate Normal Sta30 / Mth 30 We have previously discussed Covariance in relation to the variance of the sum of two random variables Review Lecture
More information2 Functions of random variables
2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as
More informationP (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n
JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are
More informationChapter 4. Continuous Random Variables
Chapter 4. Continuous Random Variables Review Continuous random variable: A random variable that can take any value on an interval of R. Distribution: A density function f : R R + such that 1. non-negative,
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More information4. CONTINUOUS RANDOM VARIABLES
IA Probability Lent Term 4 CONTINUOUS RANDOM VARIABLES 4 Introduction Up to now we have restricted consideration to sample spaces Ω which are finite, or countable; we will now relax that assumption We
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationActuarial Science Exam 1/P
Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More information4. Distributions of Functions of Random Variables
4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n
More informationExpectation and Variance
Expectation and Variance August 22, 2017 STAT 151 Class 3 Slide 1 Outline of Topics 1 Motivation 2 Expectation - discrete 3 Transformations 4 Variance - discrete 5 Continuous variables 6 Covariance STAT
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationChapter 1. Sets and probability. 1.3 Probability space
Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability
More informationThis exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.
TEST #3 STA 5326 December 4, 214 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. (You will have access to
More informationNotes for Math 324, Part 19
48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which
More informationChapter 4 Multiple Random Variables
Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More informationMore on Distribution Function
More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution
More informationTopic 3: The Expectation of a Random Variable
Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation
More informationRandom Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay
1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued
More informationTwo hours. Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER. 14 January :45 11:45
Two hours Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER PROBABILITY 2 14 January 2015 09:45 11:45 Answer ALL four questions in Section A (40 marks in total) and TWO of the THREE questions
More informationDefinition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R
Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment
More informationECON Fundamentals of Probability
ECON 351 - Fundamentals of Probability Maggie Jones 1 / 32 Random Variables A random variable is one that takes on numerical values, i.e. numerical summary of a random outcome e.g., prices, total GDP,
More informationStatistics STAT:5100 (22S:193), Fall Sample Final Exam B
Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2
More informationChapter 3. Chapter 3 sections
sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional Distributions 3.7 Multivariate Distributions
More informationmatrix-free Elements of Probability Theory 1 Random Variables and Distributions Contents Elements of Probability Theory 2
Short Guides to Microeconometrics Fall 2018 Kurt Schmidheiny Unversität Basel Elements of Probability Theory 2 1 Random Variables and Distributions Contents Elements of Probability Theory matrix-free 1
More informationJoint Distribution of Two or More Random Variables
Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few
More information