
Expectations of Sums of Random Variables

STAT/MTHE 353: 4 - More on Expectations and Variances
T. Linder, Queen's University, Winter 2017

Recall that if X_1, ..., X_n are random variables with finite expectations, then

    E(X_1 + X_2 + \cdots + X_n) = E(X_1) + E(X_2) + \cdots + E(X_n)

The X_i can be continuous or discrete or of any other type. The expectation on the left-hand side is with respect to the joint distribution of X_1, ..., X_n. The ith expectation on the right-hand side is with respect to the marginal distribution of X_i, i = 1, ..., n.

Often we can write a r.v. X as a sum of simpler random variables. Then E(X) is the sum of the expectations of these simpler random variables.

Example: Consider (X_1, ..., X_r) having a multinomial distribution with parameters n and (p_1, ..., p_r). Compute E(X_i), i = 1, ..., r.

Example: Let (X_1, ..., X_r) have the multivariate hypergeometric distribution with parameters N and n_1, ..., n_r. Compute E(X_i), i = 1, ..., r.

Example: We have two urns. Initially Urn 1 contains n red balls and Urn 2 contains n blue balls. At each stage of the experiment we pick a ball from Urn 1 at random, also pick a ball from Urn 2 at random, and then swap the balls. Let X = # of red balls in Urn 1 after k stages. Compute E(X) for even k.

Example: (Matching problem) If the integers 1, 2, ..., n are randomly permuted, what is the probability that integer i is in the ith position? What is the expected number of integers in the correct position?
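The matching problem is the canonical illustration of this decomposition: writing the number of matches as X = I_1 + ... + I_n, where I_i indicates that integer i lands in position i, gives E(X) = n · (1/n) = 1, even though the indicators are dependent. A quick Monte Carlo sketch (my own illustration, not from the slides; parameter choices are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 10, 100_000
    # X = I_1 + ... + I_n, where I_i indicates integer i lands in position i.
    # P(I_i = 1) = 1/n, so linearity gives E(X) = n * (1/n) = 1 for every n.
    matches = [np.sum(rng.permutation(n) == np.arange(n)) for _ in range(trials)]
    print(np.mean(matches))  # close to 1

Linearity requires no independence, which is exactly what makes the indicator trick work here.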

Conditional Expectation

Suppose X = (X_1, ..., X_n)^T and Y = (Y_1, ..., Y_m)^T are two vector random variables defined on the same probability space. The distributions (joint marginals) of X and Y can be described by the pdfs f_X(x) and f_Y(y) (if both X and Y are continuous) or by the pmfs p_X(x) and p_Y(y) (if both are discrete). The joint distribution of the pair (X, Y) can be described by their joint pdf f_{X,Y}(x, y) or joint pmf p_{X,Y}(x, y). The conditional distribution of X given Y = y is described by either the conditional pdf or the conditional pmf

    f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y),    p_{X|Y}(x|y) = p_{X,Y}(x, y) / p_Y(y)

Remarks:

(1) In general, X and Y can have different types of distribution (e.g., one is discrete, the other is continuous).

Example: Let n = m = 1 and X = Y + Z, where Y is a Bernoulli(p) r.v. and Z ~ N(0, σ^2), and Y and Z are independent. Determine the conditional pdf of X given Y = 0 and Y = 1. Also, determine the pdf of X.

(2) Not all random variables are either discrete or continuous. Mixed discrete-continuous and even more general distributions are possible, but they are mostly out of the scope of this course.

Definitions:

(1) The conditional expectation of X given Y = y is the mean (expectation) of the distribution of X given Y = y and is denoted by E(X | Y = y).

(2) The conditional variance of X given Y = y is the variance of the distribution of X given Y = y and is denoted by Var(X | Y = y).

If both X and Y are discrete,

    E(X | Y = y) = \sum_x x p_{X|Y}(x|y)

and

    Var(X | Y = y) = \sum_x (x - E(X | Y = y))^2 p_{X|Y}(x|y)

In case both X and Y are continuous, we have

    E(X | Y = y) = \int_{-\infty}^{\infty} x f_{X|Y}(x|y) dx

and

    Var(X | Y = y) = \int_{-\infty}^{\infty} (x - E(X | Y = y))^2 f_{X|Y}(x|y) dx

Special case: Assume X and Y are independent. Then (considering the discrete case) p_{X|Y}(x|y) = p_X(x), so that for all y,

    E(X | Y = y) = \sum_x x p_{X|Y}(x|y) = \sum_x x p_X(x) = E(X)

A similar argument shows E(X | Y = y) = E(X) if X and Y are independent continuous random variables.
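For the Bernoulli-plus-Gaussian example, conditioning pins down the distribution of X: given Y = 0, X = Z ~ N(0, σ^2); given Y = 1, X = 1 + Z ~ N(1, σ^2); and by total probability the marginal pdf of X is the mixture (1 - p) f_{X|Y}(x|0) + p f_{X|Y}(x|1). A small numerical sketch (the parameter values are mine, chosen for illustration):

    import numpy as np
    from scipy.stats import norm

    p, sigma = 0.3, 0.5          # assumed parameter values
    x = np.linspace(-2.0, 3.0, 6)
    # Given Y = 0: X = Z ~ N(0, sigma^2); given Y = 1: X = 1 + Z ~ N(1, sigma^2).
    f_given_0 = norm.pdf(x, loc=0.0, scale=sigma)
    f_given_1 = norm.pdf(x, loc=1.0, scale=sigma)
    # Marginal pdf of X: a two-component Gaussian mixture weighted by the pmf of Y.
    f_X = (1 - p) * f_given_0 + p * f_given_1
    print(f_X)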

Notation: Let g(y) = E(X | Y = y). We define the random variable E(X | Y) by setting

    E(X | Y) = g(Y)

Similarly, letting h(y) = Var(X | Y = y), the random variable Var(X | Y) is defined by

    Var(X | Y) = h(Y)

For example, if X and Y are independent, then E(X | Y = y) = E(X) (constant function), so E(X | Y) = E(X).

The following are important properties of conditional expectation. We don't prove them formally, but they should be intuitively clear.

Properties:

(i) (Linearity of conditional expectation) If X_1 and X_2 are random variables with finite expectations, then for all a, b ∈ R,

    E(aX_1 + bX_2 | Y) = a E(X_1 | Y) + b E(X_2 | Y)

(ii) If g: R → R is a function such that E[g(Y)] is finite, then

    E[g(Y) | Y] = g(Y)

and if E[g(Y) X] is finite, then

    E[g(Y) X | Y] = g(Y) E(X | Y)

Theorem 1 (Law of total expectation)

    E(X) = E[E(X | Y)]

Proof: Assume both X and Y are discrete. Then

    E[E(X | Y)] = \sum_y E(X | Y = y) p_Y(y)
                = \sum_y ( \sum_x x p_{X|Y}(x|y) ) p_Y(y)
                = \sum_y \sum_x x (p_{X,Y}(x, y) / p_Y(y)) p_Y(y)
                = \sum_x x \sum_y p_{X,Y}(x, y)
                = \sum_x x p_X(x) = E(X)

Example: Expected value of the geometric distribution...

Lemma 2 (Variance formula)

    Var(X) = E[Var(X | Y)] + Var(E(X | Y))

Proof: Since Var(X | Y = y) is the variance of the conditional distribution of X given Y = y,

    Var(X | Y) = E[X^2 | Y] - (E[X | Y])^2

Taking expectation (with respect to Y),

    E[Var(X | Y)] = E[E[X^2 | Y]] - E[(E[X | Y])^2] = E(X^2) - E[(E[X | Y])^2]

On the other hand,

    Var(E[X | Y]) = E[(E[X | Y])^2] - (E[E[X | Y]])^2 = E[(E[X | Y])^2] - (E(X))^2

Adding the two identities,

    E[Var(X | Y)] + Var(E[X | Y]) = E(X^2) - (E(X))^2 = Var(X)
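A simulation sketch (mine, not part of the slides) checking Theorem 1 and Lemma 2 on a simple hierarchical model where both sides can be computed exactly:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000
    # Y ~ Uniform{1,...,6}; given Y = y, X ~ N(y, 4).  Then E(X|Y) = Y and
    # Var(X|Y) = 4, so E(X) = E(Y) = 3.5 and Var(X) = 4 + Var(Y).
    y = rng.integers(1, 7, size=n)
    x = rng.normal(loc=y, scale=2.0)
    print(x.mean())                 # Theorem 1: approx 3.5
    print(x.var(), 4 + y.var())     # Lemma 2: the two sides agree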

Remarks:

(1) Let A be an event and X the indicator of A:

    X = 1 if A occurs, 0 if A^c occurs

Then E(X) = P(A). Assuming Y is a discrete r.v., we have E(X | Y = y) = P(A | Y = y), and the law of total expectation states

    P(A) = E(X) = \sum_y E(X | Y = y) p_Y(y) = \sum_y P(A | Y = y) p_Y(y)

which is the law of total probability. For continuous Y we have

    P(A) = \int_{-\infty}^{\infty} E(X | Y = y) f_Y(y) dy = \int_{-\infty}^{\infty} P(A | Y = y) f_Y(y) dy

(2) The law of total expectation says that we can compute the mean of a distribution by conditioning on another random variable. This distribution can itself be a conditional distribution. For example, for r.v.'s X, Y, and Z,

    E(X | Y = y) = E[E(X | Y = y, Z) | Y = y]

so that

    E(X | Y) = E[E(X | Y, Z) | Y]

For example, if Z is discrete,

    E(X | Y = y) = \sum_z E(X | Y = y, Z = z) p_{Z|Y}(z|y) = \sum_z E(X | Y = y, Z = z) P(Z = z | Y = y)

Exercise: Prove the above statement if X, Y, and Z are discrete.

Example: Repeatedly flip a biased coin which comes up heads with probability p. Let X denote the number of flips until 2 consecutive heads occur. Find E(X).

Solution:

Example: (Simplex algorithm) There are n vertices (points) that are ranked from best to worst. Start from point j and at each step, jump to one of the better points at random (with equal probability). What is the expected number of steps to reach the best point?

Solution:

Minimum mean square error (MMSE) estimation

Suppose a r.v. Y is observed and based on its value we want to guess the value of another r.v. X. Formally, we want to use a function g(Y) of Y to estimate the unobserved X in the sense of minimizing the mean square error

    E[(X - g(Y))^2]

It turns out that g*(Y) = E(X | Y) is the optimal choice.

Theorem 3: Suppose X has finite variance. Then for g*(Y) = E(X | Y) and any function g,

    E[(X - g(Y))^2] ≥ E[(X - g*(Y))^2]
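For the biased-coin example, conditioning on the first flips yields the standard answer E(X) = 1/p + 1/p^2 = (1 + p)/p^2. The slides leave the solution blank, so treat this closed form as the textbook result rather than the lecture's derivation; the simulation sketch below (mine) agrees with it:

    import numpy as np

    rng = np.random.default_rng(2)
    p, trials = 0.6, 100_000

    def flips_until_two_heads():
        # Flip until the current run of heads reaches 2; return the flip count.
        run, count = 0, 0
        while run < 2:
            count += 1
            run = run + 1 if rng.random() < p else 0
        return count

    sim = np.mean([flips_until_two_heads() for _ in range(trials)])
    print(sim, (1 + p) / p**2)  # simulation vs. the conditioning answer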

Proof: Use the properties of conditional expectation:

    E[(X - g(Y))^2 | Y]
      = E[(X - g*(Y) + g*(Y) - g(Y))^2 | Y]
      = E[(X - g*(Y))^2 + (g*(Y) - g(Y))^2 + 2(X - g*(Y))(g*(Y) - g(Y)) | Y]
      = E[(X - g*(Y))^2 | Y] + (g*(Y) - g(Y))^2 + 2(g*(Y) - g(Y)) E[X - g*(Y) | Y]

where E[X - g*(Y) | Y] = E(X | Y) - g*(Y) = 0. Thus

    E[(X - g(Y))^2 | Y] = E[(X - g*(Y))^2 | Y] + (g*(Y) - g(Y))^2

Proof cont'd: Take expectations on both sides and use the law of total expectation to obtain

    E[(X - g(Y))^2] = E[(X - g*(Y))^2] + E[(g*(Y) - g(Y))^2]

Since (g*(Y) - g(Y))^2 ≥ 0, this implies

    E[(X - g(Y))^2] ≥ E[(X - g*(Y))^2]

Remark: Note that since g*(y) = E(X | Y = y), we have

    E[Var(X | Y)] = E[(X - g*(Y))^2]

i.e., E[Var(X | Y)] is the mean square error of the MMSE estimate of X given Y.

Example: Suppose X ~ N(0, σ_X^2) and Z ~ N(0, σ_Z^2), where X and Z are independent. Here X represents a signal sent from a remote location which is corrupted by noise Z so that the received signal is Y = X + Z. What is the MMSE estimate of X given Y = y?

Random Sums

Theorem 4 (Wald's equation): Let X_1, X_2, ... be i.i.d. random variables with mean µ. Let N be a r.v. with values in {1, 2, ...} that is independent of the X_i's and has finite mean E(N). Define X = \sum_{i=1}^{N} X_i. Then

    E(X) = E(N)µ

Proof:

    E(X | N = n) = E(X_1 + \cdots + X_N | N = n)
                 = E(X_1 + \cdots + X_n | N = n)
                 = E(X_1 | N = n) + \cdots + E(X_n | N = n)   (linearity of expectation)
                 = E(X_1) + \cdots + E(X_n)                   (N and the X_i are independent)
                 = nµ
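A quick check of Wald's equation by simulation (the distribution choices below are mine, not from the slides):

    import numpy as np

    rng = np.random.default_rng(3)
    trials, mu = 100_000, 2.0
    # N ~ Geometric on {1,2,...} with success probability 0.25, independent of
    # the X_i, which are Exponential with mean mu.  Wald: E(X) = E(N)*mu = 4*2 = 8.
    N = rng.geometric(0.25, size=trials)
    totals = [rng.exponential(mu, size=n).sum() for n in N]
    print(np.mean(totals))  # close to 8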

Proof cont'd: We obtained E(X | N = n) = nµ for all n = 1, 2, ..., i.e., E(X | N) = Nµ. By the law of total expectation,

    E(X) = E[E(X | N)] = E(Nµ) = E(N)µ

Example: (Branching process) Suppose a population evolves in generations starting from a single individual (generation 0). Each individual of the ith generation produces a random number of offspring; the collection of all offspring of generation i individuals forms generation i + 1. The numbers of offspring born to distinct individuals are independent random variables with mean µ. Let X_n be the number of individuals in the nth generation. Find E(X_n).

Covariance and Correlation

Covariance

Definition: Let X and Y be two random variables with finite variance. Their covariance is defined by

    Cov(X, Y) = E[(X - E(X))(Y - E(Y))]

Properties:

(1) Expanding the product,

    Cov(X, Y) = E[XY - X E(Y) - Y E(X) + E(X)E(Y)]
              = E(XY) - E(X)E(Y) - E(X)E(Y) + E(X)E(Y)
              = E(XY) - E(X)E(Y)

The formula Cov(X, Y) = E(XY) - E(X)E(Y) is often useful in computations.

(2) Cov(X, Y) = Cov(Y, X).

(3) If X = Y we obtain

    Cov(X, X) = E[(X - E(X))^2] = Var(X)

(4) For any constants a, b, c and d,

    Cov(aX + b, cY + d) = E[(aX + b - E(aX + b))(cY + d - E(cY + d))]
                        = E[a(X - E(X)) c(Y - E(Y))]
                        = ac E[(X - E(X))(Y - E(Y))]
                        = ac Cov(X, Y)

(5) If X and Y are independent, then Cov(X, Y) = 0.

Proof: By independence, E(XY) = E(X)E(Y), so

    Cov(X, Y) = E(XY) - E(X)E(Y) = 0

Definition: Let X_1, ..., X_n be random variables with finite variances. The covariance matrix of the vector X = (X_1, ..., X_n)^T is the n × n matrix Cov(X) whose (i, j)th entry is Cov(X_i, X_j).

Remarks:

- The ith diagonal entry of Cov(X) is Var(X_i), i = 1, ..., n.
- Cov(X) is a symmetric matrix since Cov(X_i, X_j) = Cov(X_j, X_i) for all i and j.
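For the branching process, X_{n+1} is a random sum of X_n i.i.d. offspring counts, so Wald's equation gives E(X_{n+1}) = µ E(X_n), and iterating from X_0 = 1 suggests E(X_n) = µ^n. A Galton-Watson simulation sketch (the Poisson offspring distribution is my choice for illustration):

    import numpy as np

    rng = np.random.default_rng(4)
    mu, n_gen, trials = 1.5, 6, 20_000
    # Offspring counts ~ Poisson(mu).  Each generation is a random sum of the
    # previous generation's offspring counts, so E(X_{n+1}) = mu * E(X_n).
    sizes = np.ones(trials, dtype=int)
    for _ in range(n_gen):
        sizes = np.array([rng.poisson(mu, size=s).sum() for s in sizes])
    print(sizes.mean(), mu**n_gen)  # approx mu^n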

Some properties of covariance are easier to derive using a matrix formalism. Let V = {V_{ij}; i = 1, ..., m, j = 1, ..., n} be an m × n matrix of random variables having finite expectations. We define E(V) by taking expectations componentwise:

    E(V) = [ E(V_{11}) ... E(V_{1n})
             ...
             E(V_{m1}) ... E(V_{mn}) ]

Now notice that the n × n matrix (X - E(X))(X - E(X))^T has (X_i - E(X_i))(X_j - E(X_j)) in its (i, j)th entry. Thus

    Cov(X) = E[(X - E(X))(X - E(X))^T]

Lemma 5: Let A be an m × n real matrix and define Y = AX (an m-dimensional random vector). Then

    Cov(Y) = A Cov(X) A^T

Proof: First note that by the linearity of expectation, E(Y) = E(AX) = A E(X). Thus

    Cov(Y) = E[(Y - E(Y))(Y - E(Y))^T]
           = E[(AX - A E(X))(AX - A E(X))^T]
           = E[(A(X - E(X)))(A(X - E(X)))^T]
           = E[A (X - E(X))(X - E(X))^T A^T]
           = A E[(X - E(X))(X - E(X))^T] A^T
           = A Cov(X) A^T

For any m-vector c = (c_1, ..., c_m)^T we also have Cov(Y + c) = Cov(Y) since Cov(Y_i + c_i, Y_j + c_j) = Cov(Y_i, Y_j). Thus

    Cov(AX + c) = A Cov(X) A^T

Let m = 1 so that c = c is a scalar and A is a 1 × n matrix, i.e., A is a row vector A = a^T = (a_1, ..., a_n). Then

    Cov(a^T X + c) = Cov( \sum_{i=1}^n a_i X_i + c ) = Var( \sum_{i=1}^n a_i X_i + c )

On the other hand,

    Cov(a^T X + c) = a^T Cov(X) a
                   = \sum_{i=1}^n \sum_{j=1}^n a_i a_j Cov(X_i, X_j)
                   = \sum_{i=1}^n a_i^2 Var(X_i) + 2 \sum_{i<j} a_i a_j Cov(X_i, X_j)

Hence

    Var( \sum_{i=1}^n a_i X_i + c ) = \sum_{i=1}^n a_i^2 Var(X_i) + 2 \sum_{i<j} a_i a_j Cov(X_i, X_j)

Note that if X_1, ..., X_n are independent, then Cov(X_i, X_j) = 0 for i ≠ j, and we obtain

    Var( \sum_{i=1}^n a_i X_i + c ) = \sum_{i=1}^n a_i^2 Var(X_i)
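A numerical sanity check of Lemma 5 together with the shift property Cov(AX + c) = A Cov(X) A^T (the particular A, c, and Cov(X) below are arbitrary choices of mine):

    import numpy as np

    rng = np.random.default_rng(5)
    cov_X = np.array([[2.0, 0.6, 0.0],
                      [0.6, 1.0, 0.2],
                      [0.0, 0.2, 0.5]])
    X = rng.multivariate_normal(mean=np.zeros(3), cov=cov_X, size=500_000)
    A = np.array([[1.0, -1.0, 2.0],
                  [0.5,  0.0, 1.0]])
    c = np.array([3.0, -7.0])
    Y = X @ A.T + c                  # rows of Y are samples of AX + c
    print(np.cov(Y, rowvar=False))   # sample covariance of Y
    print(A @ cov_X @ A.T)           # Lemma 5: the two matrices agree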

More generally, let X = (X_1, ..., X_n)^T and Y = (Y_1, ..., Y_m)^T and let Cov(X, Y) be the n × m matrix with (i, j)th entry Cov(X_i, Y_j). Note that

    Cov(X, Y) = E[(X - E(X))(Y - E(Y))^T]

If A is a k × n matrix, B is an l × m matrix, c is a k-vector and d is an l-vector, then

    Cov(AX + c, BY + d) = E[(AX + c - E(AX + c))(BY + d - E(BY + d))^T]
                        = A E[(X - E(X))(Y - E(Y))^T] B^T

We obtain

    Cov(AX + c, BY + d) = A Cov(X, Y) B^T

We can now prove the following important property of covariance:

Lemma 6: For any constants a_1, ..., a_n and b_1, ..., b_m,

    Cov( \sum_{i=1}^n a_i X_i, \sum_{j=1}^m b_j Y_j ) = \sum_{i=1}^n \sum_{j=1}^m a_i b_j Cov(X_i, Y_j)

i.e., Cov(X, Y) is bilinear.

Proof: Let k = l = 1 and A = a^T = (a_1, ..., a_n) and B = b^T = (b_1, ..., b_m). Then we have

    Cov( \sum_{i=1}^n a_i X_i, \sum_{j=1}^m b_j Y_j ) = Cov(a^T X, b^T Y)
                                                      = Cov(AX, BY)
                                                      = A Cov(X, Y) B^T
                                                      = a^T Cov(X, Y) b
                                                      = \sum_{i=1}^n \sum_{j=1}^m a_i b_j Cov(X_i, Y_j)

The following property of covariance is of fundamental importance:

Lemma 7:

    |Cov(X, Y)| ≤ \sqrt{Var(X) Var(Y)}

Proof: First we prove the Cauchy-Schwarz inequality for random variables U and V with finite variances. Let λ ∈ R; then

    0 ≤ E[(U - λV)^2] = E(U^2) - 2λ E(UV) + λ^2 E(V^2)

This is a quadratic polynomial in λ which cannot have two distinct real roots.

Proof cont'd: Thus its discriminant cannot be positive:

    4 [E(UV)]^2 - 4 E(U^2) E(V^2) ≤ 0

so we obtain

    [E(UV)]^2 ≤ E(U^2) E(V^2)

Use this with U = X - E(X) and V = Y - E(Y) to get

    |Cov(X, Y)| = |E[(X - E(X))(Y - E(Y))]| ≤ \sqrt{ E[(X - E(X))^2] E[(Y - E(Y))^2] } = \sqrt{Var(X) Var(Y)}
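Both lemmas are easy to probe numerically; the sketch below (my own, with an arbitrary correlated pair) checks the Cauchy-Schwarz bound of Lemma 7 and a scalar instance of bilinearity, Cov(2X, 3Y) = 6 Cov(X, Y):

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.normal(size=500_000)
    y = 0.8 * x + rng.normal(size=500_000)   # a correlated pair
    cov_xy = np.cov(x, y)[0, 1]
    # Lemma 7: |Cov(X,Y)| <= sqrt(Var(X) Var(Y))
    print(abs(cov_xy) <= np.sqrt(x.var(ddof=1) * y.var(ddof=1)))
    # Scalar instance of Lemma 6: Cov(2X, 3Y) = 6 Cov(X, Y)
    print(np.cov(2 * x, 3 * y)[0, 1], 6 * cov_xy)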

Correlation

Recall that Cov(aX, bY) = ab Cov(X, Y). This is an undesirable property if we want to use Cov(X, Y) as a measure of association between X and Y. A proper normalization will solve this problem:

Definition: The correlation coefficient between X and Y having nonzero variances is defined by

    ρ(X, Y) = Cov(X, Y) / \sqrt{Var(X) Var(Y)}

Remarks:

- Since Var(aX + b) = a^2 Var(X),

      ρ(aX + b, aY + d) = ρ(X, Y)

- Letting µ_X = E(X), µ_Y = E(Y), σ_X^2 = Var(X), σ_Y^2 = Var(Y), we have

      ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y)
              = Cov(X - µ_X, Y - µ_Y) / (σ_X σ_Y)
              = Cov( (X - µ_X)/σ_X, (Y - µ_Y)/σ_Y )

  Thus ρ(X, Y) is the covariance between the standardized versions of X and Y.

- If X and Y are independent, then Cov(X, Y) = 0, so ρ(X, Y) = 0. On the other hand, ρ(X, Y) = 0 does not imply that X and Y are independent.

Remark: If ρ(X, Y) = 0 we say that X and Y are uncorrelated.

Example: Find random variables X and Y that are uncorrelated but not independent.

Example: Covariance and correlation for multinomial random variables...

Theorem 8: The correlation always satisfies

    |ρ(X, Y)| ≤ 1

Moreover, |ρ(X, Y)| = 1 if and only if Y = aX + b for some constants a and b (a ≠ 0), i.e., Y is an affine function of X.

Proof: We know that |Cov(X, Y)| ≤ \sqrt{Var(X) Var(Y)}, so |ρ(X, Y)| ≤ 1 always holds. Let's assume now that Y = aX + b, where a ≠ 0. Then

    Cov(X, Y) = Cov(X, aX + b) = Cov(X, aX) = a Cov(X, X) = a Var(X)

so

    ρ(X, Y) = a Var(X) / \sqrt{Var(X) a^2 Var(X)} = a / |a| = ±1

Proof cont'd: Conversely, suppose that ρ(X, Y) = 1. Then

    Var( X/σ_X - Y/σ_Y ) = Var(X/σ_X) + Var(Y/σ_Y) - 2 Cov(X/σ_X, Y/σ_Y)
                         = Var(X)/σ_X^2 + Var(Y)/σ_Y^2 - 2 Cov(X, Y)/(σ_X σ_Y)
                         = 1 + 1 - 2ρ(X, Y) = 0

This means that X/σ_X - Y/σ_Y = c for some constant c, so

    Y = (σ_Y/σ_X) X - σ_Y c

If ρ(X, Y) = -1, consider Var( X/σ_X + Y/σ_Y ) and use the same proof.

Remark: The previous theorem implies that correlation can be thought of as a measure of linear association (linear dependence) between X and Y. Recall the multinomial example...
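For the uncorrelated-but-dependent example, one standard choice is X ~ N(0, 1) and Y = X^2: then Cov(X, Y) = E(X^3) - E(X)E(X^2) = 0, yet Y is a deterministic function of X. A sketch confirming ρ ≈ 0 (this particular pair is my choice; many others work):

    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.normal(size=1_000_000)
    y = x**2                        # a deterministic function of X, so dependent
    # Cov(X, X^2) = E(X^3) - E(X) E(X^2) = 0 for X ~ N(0,1), so rho = 0.
    print(np.corrcoef(x, y)[0, 1])  # approx 0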

Example: (Linear MMSE estimation) Let X and Y be random variables with zero means and finite variances σ_X^2 > 0 and σ_Y^2 > 0. Suppose we want to estimate X in the MMSE sense using a linear function of Y; i.e., we are looking for a ∈ R minimizing

    E[(X - aY)^2]

Find the minimizing a and determine the resulting minimum mean square error. Relate the results to ρ(X, Y).
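The exercise is left open on the slide. For a numerical check, note that setting the derivative of E[(X - aY)^2] in a to zero gives the standard minimizer a* = E(XY)/E(Y^2) = Cov(X, Y)/σ_Y^2 (for zero-mean X, Y), with minimum mean square error σ_X^2 (1 - ρ(X, Y)^2). A sketch verifying this on simulated data (the model and parameters are mine):

    import numpy as np

    rng = np.random.default_rng(8)
    n = 1_000_000
    y = rng.normal(scale=2.0, size=n)              # sigma_Y = 2
    x = 0.75 * y + rng.normal(scale=1.0, size=n)   # zero-mean X correlated with Y

    a_star = np.mean(x * y) / np.mean(y**2)        # = Cov(X,Y)/sigma_Y^2, approx 0.75
    rho = np.corrcoef(x, y)[0, 1]
    mse = np.mean((x - a_star * y) ** 2)
    print(a_star)
    print(mse, x.var() * (1 - rho**2))             # minimum MSE = sigma_X^2 (1 - rho^2)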
