Chapter 12: Bivariate & Conditional Distributions
Chapter 12: Bivariate & Conditional Distributions
James B. Ramsey
March 2007
Introduction

Key relationships between joint, conditional, and marginal distributions.

Joint density:

f_J(X_1, X_2) = f_{2|1}(X_2 | X_1) f_1(X_1) = f_{1|2}(X_1 | X_2) f_2(X_2)

Marginal density:

f_i(X_i) = \int f_J(X_1, X_2) \, dx_j, \quad j \neq i

Probability distributions:

\Pr(X_1 \le x_0) = \int_{-\infty}^{x_0} f_1(X_1) \, dx_1 = \int_{-\infty}^{x_0} \left[ \int f_J(X_1, X_2) \, dx_2 \right] dx_1
Conditional probability distributions:

\Pr(X_1 \le x_0 | Y = y_0) = \int_{-\infty}^{x_0} f_{1|2}(X_1 | Y = y_0) \, dx_1

Joint probabilities:

\Pr(X_1 \le x_{1,0}, X_2 \le x_{2,0}) = \int_{-\infty}^{x_{1,0}} \int_{-\infty}^{x_{2,0}} f_J(X_1, X_2) \, dx_2 \, dx_1

f_1(X_1) = \int f_J(X_1, X_2) \, dx_2 = \int f_{1|2}(X_1 | X_2) f_2(X_2) \, dx_2

That is, the marginal distribution is the weighted sum of the conditional distributions.
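A minimal simulation sketch of the last identity, f_1(x_1) = \int f_{1|2}(x_1|x_2) f_2(x_2) dx_2: the distributions here are hypothetical stand-ins chosen so the marginal is known in closed form. Drawing X_2 from its marginal and then X_1 from the conditional should reproduce the marginal of X_1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) model: X2 ~ N(0, 1) and X1 | X2 ~ N(0.5*X2, 1).
# Then the marginal of X1 is N(0, 1 + 0.25) = N(0, 1.25) by the
# mixture identity f1(x1) = integral of f_{1|2}(x1|x2) f2(x2) dx2.
n = 200_000
x2 = rng.normal(0.0, 1.0, n)          # draw from the marginal of X2
x1 = rng.normal(0.5 * x2, 1.0)        # then from the conditional of X1 | X2

print(x1.mean(), x1.var())            # close to 0 and 1.25
```

The sample mean and variance of the two-stage draws match the implied marginal, which is exactly the "weighted sum of conditionals" reading of the formula.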
Association is not Causality

This follows clearly from the fact that:

f_J(X_1, X_2) = f_{2|1}(X_2 | X_1) f_1(X_1) = f_{1|2}(X_1 | X_2) f_2(X_2)

Sources of "association":
- measuring a flat & a curved plate;
- visitors to a real estate agent: buyers and sellers;
- individual consumption & income;
- individual height & weight;
- breakdown of two components of a machine;
- compare I.Q. & height;
- health & the wearing of a top hat.
Buyers & Renters Entering a Real Estate Office

Arrivals: the distribution of the number of arrivals N is given by:

\frac{e^{-\lambda} \lambda^N}{N!}

Recall the conditions for a Poisson distribution. Given N arrivals, the distribution of the number of buyers B is binomial:

\binom{N}{B} \pi^B (1 - \pi)^{N - B}, \quad N = B + R
Replace N by R + B to obtain the joint distribution:

\frac{e^{-\lambda} \lambda^{R+B}}{(R+B)!} \left[ \binom{R+B}{B} \pi^B (1-\pi)^R \right] = \frac{e^{-\lambda} \lambda^{R+B}}{R! \, B!} \, \pi^B (1-\pi)^R

This is a two-parameter distribution, (\lambda, \pi):

\lambda = mean arrivals per hour; \pi = proportion of buyers.
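A short simulation of this arrival process (the numeric parameter values are arbitrary illustrations): Poisson arrivals are thinned into buyers and renters by a Bernoulli(\pi) coin. A useful consequence of the joint pmf above is that it factorizes, so B and R come out as independent Poisson counts with means \lambda\pi and \lambda(1-\pi).

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative values: lam arrivals/hour, pi = Pr(buyer).
lam, pi, n_hours = 10.0, 0.3, 100_000
N = rng.poisson(lam, n_hours)     # total arrivals each hour
B = rng.binomial(N, pi)           # buyers, binomial given N arrivals
R = N - B                         # renters

# The joint pmf e^{-lam} lam^{R+B} pi^B (1-pi)^R / (R! B!) factorizes,
# so B and R behave as independent Poissons with means lam*pi, lam*(1-pi).
print(B.mean(), R.mean(), np.corrcoef(B, R)[0, 1])
```

The sample means land near \lambda\pi = 3 and \lambda(1-\pi) = 7, and the sample correlation between B and R is near zero, consistent with the factorization.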
Derivation of the Bivariate Normal Distribution

Recall the bivariate Gaussian density formed from a pair of independent Gaussian densities:

\phi(X, Y) = \phi(X)\phi(Y) = \frac{\exp\{-\frac{1}{2}(\frac{X-\eta_x}{\sigma_x})^2\}}{\sqrt{2\pi}\,\sigma_x} \cdot \frac{\exp\{-\frac{1}{2}(\frac{Y-\eta_y}{\sigma_y})^2\}}{\sqrt{2\pi}\,\sigma_y}
Let \eta_x, \eta_y both equal zero and \sigma_x = \sigma_y = 1 in order to simplify the algebra:

\phi(X, Y) = \frac{\exp\{-\frac{1}{2}X^2\}}{\sqrt{2\pi}} \cdot \frac{\exp\{-\frac{1}{2}Y^2\}}{\sqrt{2\pi}} = \frac{\exp\{-\frac{1}{2}[X^2 + Y^2]\}}{2\pi}
But if X and Y are associated, we speculate from the calculation of "r" that for some parameter \rho the quadratic above contains a term like:

\exp\{-\tfrac{1}{2}[X^2 + Y^2 - 2\rho XY]\}

If correct, integration to 1 implies that:

\phi(X, Y) = \frac{\exp\{-\frac{1}{2(1-\rho^2)}[X^2 + Y^2 - 2\rho XY]\}}{2\pi\sqrt{1-\rho^2}}
And in general, for non-zero means and non-unit variances, one has:

\phi(X, Y) = \frac{\exp\{-\frac{1}{2(1-\rho^2)}[(\frac{X-\eta_x}{\sigma_x})^2 + (\frac{Y-\eta_y}{\sigma_y})^2 - 2\rho(\frac{X-\eta_x}{\sigma_x})(\frac{Y-\eta_y}{\sigma_y})]\}}{2\pi\sqrt{1-\rho^2}\,\sigma_x\sigma_y}

If \rho equals zero, we get the joint distribution of a pair of independent variables.

The \sigma_x, \sigma_y in the denominator result from transforming from standardized to non-standardized variables, e.g. U, V defined by the transformations:
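A quick numerical check of the general density above (the parameter values are arbitrary choices for illustration): evaluated on a grid, it should integrate to one, confirming that the 2\pi\sqrt{1-\rho^2}\sigma_x\sigma_y normalization is right.

```python
import numpy as np

# Assumed illustrative parameters for the general bivariate normal.
eta_x, eta_y, sig_x, sig_y, rho = 1.0, -2.0, 1.5, 0.5, 0.6

def phi(x, y):
    # Density from the slide, written via standardized deviations u, v.
    u = (x - eta_x) / sig_x
    v = (y - eta_y) / sig_y
    q = (u**2 + v**2 - 2.0 * rho * u * v) / (1.0 - rho**2)
    return np.exp(-0.5 * q) / (2.0 * np.pi * np.sqrt(1.0 - rho**2) * sig_x * sig_y)

# Riemann sum over +/- 8 standard deviations in each coordinate.
xs = np.linspace(eta_x - 8 * sig_x, eta_x + 8 * sig_x, 801)
ys = np.linspace(eta_y - 8 * sig_y, eta_y + 8 * sig_y, 801)
X, Y = np.meshgrid(xs, ys)
total = phi(X, Y).sum() * (xs[1] - xs[0]) * (ys[1] - ys[0])
print(total)   # ≈ 1.0
```

Setting rho = 0.0 in the same sketch makes phi factor into the product of the two univariate normal densities, matching the independence remark above.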
U = \frac{X - \eta_x}{\sigma_x} implies dU = \frac{1}{\sigma_x} dX;

and V = \frac{Y - \eta_y}{\sigma_y} implies dV = \frac{1}{\sigma_y} dY;

so that the change in "scale" is allowed for, in that:

dU = \left(\frac{dU}{dX}\right) dX; \quad dV = \left(\frac{dV}{dY}\right) dY.

We multiply the density in (U, V) by the re-scaling \frac{1}{\sigma_x}\frac{1}{\sigma_y}, so that the density integrates to one. See overheads.
The Conditional Normal Density Function

For simplicity let the means equal 0 and the variances equal 1. The joint distribution is:

\phi(X, Y) = \frac{\exp\{-\frac{1}{2(1-\rho^2)}[X^2 + Y^2 - 2\rho XY]\}}{2\pi\sqrt{1-\rho^2}}
which can be rewritten as the product of a conditional and a marginal distribution:

\phi(X, Y) = \phi(Y|X)\,\phi(X) = \frac{\exp\{-\frac{X^2}{2(1-\rho^2)}\}}{\sqrt{2\pi}} \cdot \frac{\exp\{-\frac{1}{2(1-\rho^2)}[Y^2 - 2\rho XY]\}}{\sqrt{2\pi}\sqrt{1-\rho^2}}
In [Y^2 - 2\rho XY], complete the square by adding and subtracting \rho^2 X^2. This yields:

[Y^2 - 2\rho XY + \rho^2 X^2] = [Y - \rho X]^2

and, moving the compensating term into the X factor,

\exp\{-\tfrac{X^2}{2(1-\rho^2)} + \tfrac{\rho^2 X^2}{2(1-\rho^2)}\} = \exp\{-\tfrac{X^2 - \rho^2 X^2}{2(1-\rho^2)}\} = \exp\{-\tfrac{X^2}{2}\}
\phi(X, Y) = \left\{\frac{\exp\{-\frac{[Y - \rho X]^2}{2(1-\rho^2)}\}}{\sqrt{2\pi}\sqrt{1-\rho^2}}\right\} \left\{\frac{\exp\{-\frac{X^2}{2}\}}{\sqrt{2\pi}}\right\}

The conditional distribution is therefore Gaussian, with conditional mean \rho X and variance (1 - \rho^2).
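A simulation sketch of this result (rho and the conditioning point are arbitrary illustrative choices): draw standard bivariate normal pairs, then look at Y only for draws whose X lands near a fixed x_0. Within that window, Y should have mean near \rho x_0 and variance near 1 - \rho^2.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed illustrative correlation; standard bivariate normal pairs are
# generated directly from the factorization Y | X ~ N(rho*X, 1 - rho^2).
rho, n = 0.8, 1_000_000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.normal(size=n)

x0 = 1.0
sel = np.abs(x - x0) < 0.05       # draws with X near x0
print(y[sel].mean(), y[sel].var())   # ~ rho*x0 = 0.8 and ~ 1 - rho^2 = 0.36
```

Repeating with a different x_0 shifts the conditional mean to \rho x_0 while leaving the conditional variance at 1 - \rho^2, as the algebra predicts.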
The General Conditional Distribution

If \eta_x, \eta_y are non-zero and \sigma_x, \sigma_y are non-unit, then one can show:

\sigma^2_{Y|x_0} = (1-\rho^2)\sigma^2_y; \quad \sigma^2_{X|y_0} = (1-\rho^2)\sigma^2_x
Most important is the conditional mean:

E\{Y | X = x_0\} = \eta_y + \rho\frac{\sigma_y}{\sigma_x}(x_0 - \eta_x) = \left[\eta_y - \rho\frac{\sigma_y}{\sigma_x}\eta_x\right] + \rho\frac{\sigma_y}{\sigma_x}x_0 = \alpha + \beta x_0

which is a linear relationship between Y and X; cf. Chapter 5.

Recall that the conditional mean of Y|X is the mean of Y with respect to the conditional distribution f(Y|X).
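The linearity of the conditional mean can be checked by simulation (all parameter values below are assumed for illustration): generate bivariate normal data with given moments, fit a least-squares line as in Chapter 5, and compare the fitted coefficients with \alpha = \eta_y - \rho(\sigma_y/\sigma_x)\eta_x and \beta = \rho\sigma_y/\sigma_x.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed illustrative moments of the bivariate normal.
eta_x, eta_y, sig_x, sig_y, rho = 2.0, 5.0, 1.0, 3.0, 0.5
beta = rho * sig_y / sig_x                # = 1.5
alpha = eta_y - beta * eta_x              # = 2.0

n = 500_000
x = rng.normal(eta_x, sig_x, n)
# Y | X is normal with mean alpha + beta*x and variance (1 - rho^2)*sig_y^2.
y = alpha + beta * x + np.sqrt(1.0 - rho**2) * sig_y * rng.normal(size=n)

# Least-squares fit recovers (alpha, beta); cf. Chapter 5.
b_hat, a_hat = np.polyfit(x, y, 1)        # slope first, then intercept
print(a_hat, b_hat)                       # ~ 2.0 and 1.5
```

The regression line estimated from the sample coincides with the theoretical conditional mean line, which is the population object that Chapter 5's sample regression estimates.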
Moments of Bivariate Distributions

Because f_x(X) = \int f_J(X, Y)\,dy and f_y(Y) = \int f_J(X, Y)\,dx, the univariate moments are all defined as before.

The theoretical analogue of the sample covariance is:

\mu_{1,1}(X, Y) = E\{(X - \eta_x)(Y - \eta_y)\} = \int\int (X - \eta_x)(Y - \eta_y) f(X, Y)\,dx\,dy = \int\int XY\,f(X, Y)\,dx\,dy - \eta_x\eta_y
If X and Y are independently distributed, E\{(X - \eta_x)(Y - \eta_y)\} = 0.

\mu_{1,1}(X, Y) = \sigma_{x,y} = E\{(X - \eta_x)(Y - \eta_y)\}

is the theoretical covariance.
If X and Y are not independent but jointly Gaussian, with \eta_x = \eta_y = 0 and \sigma_x = \sigma_y = 1 for convenience, then:

E\{XY\} = \int_x \int_y XY\,\phi(X, Y)\,dx\,dy = \int_x \int_y XY\,\phi(Y|X)\phi(X)\,dx\,dy
= \int_x X \left[ \int_{y|x} Y \, \frac{\exp\{-\frac{1}{2(1-\rho^2)}[Y - \rho X]^2\}}{\sqrt{2\pi}\sqrt{1-\rho^2}} \, dy \right] \frac{\exp\{-\frac{X^2}{2}\}}{\sqrt{2\pi}} \, dx

= \int_x X \{\rho X\} \, \frac{\exp\{-\frac{X^2}{2}\}}{\sqrt{2\pi}} \, dx = E\{\rho X^2\} = \rho
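The iterated-expectation argument above can be mirrored by Monte Carlo (rho below is an arbitrary illustrative value): the inner integral is the conditional mean \rho X, so the sample average of XY should come out near \rho.

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed illustrative correlation for standard bivariate normal pairs.
rho, n = 0.4, 2_000_000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.normal(size=n)

# E{XY} = E{X * E[Y|X]} = E{rho * X^2} = rho, since Var(X) = 1.
print(np.mean(x * y))   # ~ 0.4
```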
since the variance of X is unity by assumption. \rho measures the degree of linear association.

If \sigma_x and \sigma_y are non-unit,

E\{XY\} = \rho\,\sigma_x\sigma_y

is the covariance (in units of X times units of Y), and \rho, the correlation coefficient, is dimensionless.
The Sampling of Joint & Conditional Distributions

Chapter 9 discussed the sampling of univariate distributions, and we have explored the use of simple random sampling at length.

Sampling from a joint distribution is similar: collect random samples of individuals and measure the joint observations; e.g. sample individuals and measure income & consumption, or height & weight.
Sampling from conditional distributions is different. One can sample height given weight, or weight given height.

This can be achieved by sampling heights and, for each height, sampling weights; or by sampling weights and, for each weight, sampling heights.

If using natural experiments, be sure which conditional distribution is being sampled.
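A sketch of the two-stage scheme for the height/weight example (all distributions and coefficients are hypothetical stand-ins, not data): sample heights first, then a weight for each height. The full sample then represents the joint distribution, while the sub-sample at any fixed height represents the conditional distribution of weight given that height, so the two averages differ.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed illustrative model: height ~ N(170, 8^2) cm, and
# weight | height ~ N(-40 + 0.65*height, 6^2) kg.
n = 100_000
height = rng.normal(170.0, 8.0, n)                    # first stage
weight = -40.0 + 0.65 * height + rng.normal(0.0, 6.0, n)  # second stage

# All draws estimate the marginal of weight; draws near height = 180 cm
# estimate the conditional of weight given that height.
tall = np.abs(height - 180.0) < 1.0
print(weight.mean(), weight[tall].mean())   # ~ 70.5 vs ~ 77.0
```

Reversing the stages (weights first, then heights given weight) would instead sample the other conditional, which is the distinction the slide warns about for natural experiments.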
Consider some examples:
- sampling I.Q. and heights;
- sampling electrical output & fuel inputs;
- incomes & consumption.
End of Chapter 12.
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationFinal Exam # 3. Sta 230: Probability. December 16, 2012
Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets
More informationThe Multivariate Normal Distribution 1
The Multivariate Normal Distribution 1 STA 302 Fall 2017 1 See last slide for copyright information. 1 / 40 Overview 1 Moment-generating Functions 2 Definition 3 Properties 4 χ 2 and t distributions 2
More informationSTA 256: Statistics and Probability I
Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested
More informationMULTIVARIATE PROBABILITY DISTRIBUTIONS
MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined
More informationThree hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER.
Three hours To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER EXTREME VALUES AND FINANCIAL RISK Examiner: Answer QUESTION 1, QUESTION
More informationFor a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,
CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance
More informationMC3: Econometric Theory and Methods. Course Notes 4
University College London Department of Economics M.Sc. in Economics MC3: Econometric Theory and Methods Course Notes 4 Notes on maximum likelihood methods Andrew Chesher 25/0/2005 Course Notes 4, Andrew
More informationSTAT:5100 (22S:193) Statistical Inference I
STAT:5100 (22S:193) Statistical Inference I Week 10 Luke Tierney University of Iowa Fall 2015 Luke Tierney (U Iowa) STAT:5100 (22S:193) Statistical Inference I Fall 2015 1 Monday, October 26, 2015 Recap
More informationThe Multivariate Gaussian Distribution [DRAFT]
The Multivariate Gaussian Distribution DRAFT David S. Rosenberg Abstract This is a collection of a few key and standard results about multivariate Gaussian distributions. I have not included many proofs,
More informationProbability. Table of contents
Probability Table of contents 1. Important definitions 2. Distributions 3. Discrete distributions 4. Continuous distributions 5. The Normal distribution 6. Multivariate random variables 7. Other continuous
More informationProblem. Set up the definite integral that gives the area of the region. y 1 = x 2 6x, y 2 = 0. dx = ( 2x 2 + 6x) dx.
Wednesday, September 3, 5 Page Problem Problem. Set up the definite integral that gives the area of the region y x 6x, y Solution. The graphs intersect at x and x 6 and y is the uppermost function. So
More informationA Modification of Linfoot s Informational Correlation Coefficient
Austrian Journal of Statistics April 07, Volume 46, 99 05. AJS http://www.ajs.or.at/ doi:0.773/ajs.v46i3-4.675 A Modification of Linfoot s Informational Correlation Coefficient Georgy Shevlyakov Peter
More informationLecture 14: Multivariate mgf s and chf s
Lecture 14: Multivariate mgf s and chf s Multivariate mgf and chf For an n-dimensional random vector X, its mgf is defined as M X (t) = E(e t X ), t R n and its chf is defined as φ X (t) = E(e ıt X ),
More informationHomework 9 (due November 24, 2009)
Homework 9 (due November 4, 9) Problem. The join probability density function of X and Y is given by: ( f(x, y) = c x + xy ) < x
More informationStat 5101 Notes: Algorithms
Stat 5101 Notes: Algorithms Charles J. Geyer January 22, 2016 Contents 1 Calculating an Expectation or a Probability 3 1.1 From a PMF........................... 3 1.2 From a PDF...........................
More informationMath 265 (Butler) Practice Midterm III B (Solutions)
Math 265 (Butler) Practice Midterm III B (Solutions). Set up (but do not evaluate) an integral for the surface area of the surface f(x, y) x 2 y y over the region x, y 4. We have that the surface are is
More informationStat 5101 Lecture Slides: Deck 8 Dirichlet Distribution. Charles J. Geyer School of Statistics University of Minnesota
Stat 5101 Lecture Slides: Deck 8 Dirichlet Distribution Charles J. Geyer School of Statistics University of Minnesota 1 The Dirichlet Distribution The Dirichlet Distribution is to the beta distribution
More informationRegression. Econometrics II. Douglas G. Steigerwald. UC Santa Barbara. D. Steigerwald (UCSB) Regression 1 / 17
Regression Econometrics Douglas G. Steigerwald UC Santa Barbara D. Steigerwald (UCSB) Regression 1 / 17 Overview Reference: B. Hansen Econometrics Chapter 2.20-2.23, 2.25-2.29 Best Linear Projection is
More informationMultivariate Gaussian Distribution. Auxiliary notes for Time Series Analysis SF2943. Spring 2013
Multivariate Gaussian Distribution Auxiliary notes for Time Series Analysis SF2943 Spring 203 Timo Koski Department of Mathematics KTH Royal Institute of Technology, Stockholm 2 Chapter Gaussian Vectors.
More information9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences Emery N. Brown
9.07 Introduction to Probability and Statistics for Brain and Cognitive Sciences Emery N. Brown I. Objectives Lecture 5: Conditional Distributions and Functions of Jointly Distributed Random Variables
More informationHW4 : Bivariate Distributions (1) Solutions
STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 7 Néhémy Lim HW4 : Bivariate Distributions () Solutions Problem. The joint probability mass function of X and Y is given by the following table : X Y
More informationSpeci cation of Conditional Expectation Functions
Speci cation of Conditional Expectation Functions Econometrics Douglas G. Steigerwald UC Santa Barbara D. Steigerwald (UCSB) Specifying Expectation Functions 1 / 24 Overview Reference: B. Hansen Econometrics
More informationMultivariate distributions
CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density
More information4. CONTINUOUS RANDOM VARIABLES
IA Probability Lent Term 4 CONTINUOUS RANDOM VARIABLES 4 Introduction Up to now we have restricted consideration to sample spaces Ω which are finite, or countable; we will now relax that assumption We
More informationTutorial 2: Comparative Statics
Tutorial 2: Comparative Statics ECO42F 20 Derivatives and Rules of Differentiation For each of the functions below: (a) Find the difference quotient. (b) Find the derivative dx. (c) Find f (4) and f (3)..
More informationIntroduction to Normal Distribution
Introduction to Normal Distribution Nathaniel E. Helwig Assistant Professor of Psychology and Statistics University of Minnesota (Twin Cities) Updated 17-Jan-2017 Nathaniel E. Helwig (U of Minnesota) Introduction
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More informationJoint Probability Distributions, Correlations
Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more
More informationThe Binomial distribution. Probability theory 2. Example. The Binomial distribution
Probability theory Tron Anders Moger September th 7 The Binomial distribution Bernoulli distribution: One experiment X i with two possible outcomes, probability of success P. If the experiment is repeated
More informationMath 510 midterm 3 answers
Math 51 midterm 3 answers Problem 1 (1 pts) Suppose X and Y are independent exponential random variables both with parameter λ 1. Find the probability that Y < 7X. P (Y < 7X) 7x 7x f(x, y) dy dx e x e
More informationf X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx
INDEPENDENCE, COVARIANCE AND CORRELATION Independence: Intuitive idea of "Y is independent of X": The distribution of Y doesn't depend on the value of X. In terms of the conditional pdf's: "f(y x doesn't
More informationOn the Expected Absolute Value of a Bivariate Normal Distribution
Journal of Statistical Theory and Applications Volume, Number 4, 0, pp. 37-377 ISSN 538-7887 On the Expected Absolute Value of a Bivariate Normal Distribution S. Reza H. Shojaie, Mina Aminghafari and Adel
More informationChapter 4 Multiple Random Variables
Review for the previous lecture Definition: n-dimensional random vector, joint pmf (pdf), marginal pmf (pdf) Theorem: How to calculate marginal pmf (pdf) given joint pmf (pdf) Example: How to calculate
More informationRandom Signals and Systems. Chapter 3. Jitendra K Tugnait. Department of Electrical & Computer Engineering. Auburn University.
Random Signals and Systems Chapter 3 Jitendra K Tugnait Professor Department of Electrical & Computer Engineering Auburn University Two Random Variables Previously, we only dealt with one random variable
More informationExpectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,
More informationPractice Examination # 3
Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single
More informationSTA 2201/442 Assignment 2
STA 2201/442 Assignment 2 1. This is about how to simulate from a continuous univariate distribution. Let the random variable X have a continuous distribution with density f X (x) and cumulative distribution
More informationGeneral Random Variables
Chater General Random Variables. Law of a Random Variable Thus far we have considered onl random variables whose domain and range are discrete. We now consider a general random variable X! defined on the
More informationECON 5350 Class Notes Review of Probability and Distribution Theory
ECON 535 Class Notes Review of Probability and Distribution Theory 1 Random Variables Definition. Let c represent an element of the sample space C of a random eperiment, c C. A random variable is a one-to-one
More informationThis does not cover everything on the final. Look at the posted practice problems for other topics.
Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry
More informationSTOR Lecture 16. Properties of Expectation - I
STOR 435.001 Lecture 16 Properties of Expectation - I Jan Hannig UNC Chapel Hill 1 / 22 Motivation Recall we found joint distributions to be pretty complicated objects. Need various tools from combinatorics
More informationChapter 9: Elementary Sampling Theory
Chapter 9: Elementary Sampling Theory James B. Ramsey Economics; NYU 2007-2-3 Ramsey (Institute) Chapter 9: 2007-2-3 1 / 20 Sampling Theory is the LINK between Theory & Observation Chapters 1 to 5: Data
More information
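The factorization f(X1, X2) = f(X2|X1) f1(X1), the marginal-by-integration rule, and the double-sum joint probability above can be checked numerically on a small discrete example. This is a minimal sketch: the pmf values below are invented purely for illustration and are not from the chapter.

```python
# Numerical check of the joint/conditional/marginal identities on a
# small discrete example (pmf values are hypothetical, for illustration only).

# Joint pmf f(x1, x2) on {0, 1, 2} x {0, 1}; entries sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.15,
    (1, 0): 0.20, (1, 1): 0.25,
    (2, 0): 0.05, (2, 1): 0.25,
}

# Marginal of X1: f1(x1) = sum over x2 of f(x1, x2)
# (the discrete analogue of integrating the joint density over X2).
f1 = {x1: sum(p for (a, _), p in joint.items() if a == x1) for x1 in (0, 1, 2)}

# Conditional of X2 given X1: f(x2 | x1) = f(x1, x2) / f1(x1)
cond = {(x1, x2): joint[(x1, x2)] / f1[x1] for (x1, x2) in joint}

# Factorization: f(x1, x2) = f(x2 | x1) * f1(x1) must hold at every point.
for (x1, x2), p in joint.items():
    assert abs(cond[(x1, x2)] * f1[x1] - p) < 1e-12

# Joint probability Pr(X1 <= 1, X2 <= 0) as a double sum over the region.
pr = sum(p for (x1, x2), p in joint.items() if x1 <= 1 and x2 <= 0)
print(round(pr, 2))
```

The same pattern extends to the continuous case by replacing the sums with numerical integration of a joint density.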