Chapter 4. Multivariate Distributions


4.1 Bivariate Distributions

For a pair of r.v.s $(X, Y)$, the joint CDF is defined as
$F_{X,Y}(x,y) = P(X \le x, Y \le y)$.
Obviously, the marginal distributions may be obtained easily from the joint distribution:
$F_X(x) = P(X \le x) = P(X \le x, Y < \infty) = F_{X,Y}(x, \infty)$, and $F_Y(y) = F_{X,Y}(\infty, y)$.

Covariance and correlation of $X$ and $Y$:
$\mathrm{Cov}(X,Y) = E\{(X - EX)(Y - EY)\} = E(XY) - (EX)(EY)$,
$\mathrm{Corr}(X,Y) = \mathrm{Cov}(X,Y) \big/ \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}$.
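The identity $\mathrm{Cov}(X,Y) = E(XY) - (EX)(EY)$ can be checked numerically. A minimal Python sketch (the notes themselves use R), on a hypothetical pair of correlated variables:

```python
import numpy as np

# Monte Carlo check of Cov(X,Y) = E(XY) - (EX)(EY) on a hypothetical
# correlated pair: Y = 0.5*X + noise, so Cov(X,Y) = 0.5, Var(Y) = 1.25.
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)

cov_direct = np.mean((x - x.mean()) * (y - y.mean()))   # E{(X-EX)(Y-EY)}
cov_moments = np.mean(x * y) - x.mean() * y.mean()       # E(XY) - EX*EY
corr = cov_direct / np.sqrt(x.var() * y.var())
```

The two covariance expressions agree exactly (they are the same algebra), and the sample correlation is close to the theoretical $0.5/\sqrt{1.25} \approx 0.447$.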

Discrete bivariate distributions

If $X$ takes discrete values $x_1, \dots, x_m$ and $Y$ takes discrete values $y_1, \dots, y_n$, their joint probability function may be presented in a table:

    X \ Y |  y_1    y_2   ...   y_n  |
    x_1   |  p_11   p_12  ...   p_1n | p_1.
    x_2   |  p_21   p_22  ...   p_2n | p_2.
    ...   |  ...    ...   ...   ...  | ...
    x_m   |  p_m1   p_m2  ...   p_mn | p_m.
          |  p_.1   p_.2  ...   p_.n |

where $p_{ij} = P(X = x_i, Y = y_j)$, and
$p_{i\cdot} = P(X = x_i) = \sum_{j=1}^n P(X = x_i, Y = y_j) = \sum_j p_{ij}$,
$p_{\cdot j} = P(Y = y_j) = \sum_{i=1}^m P(X = x_i, Y = y_j) = \sum_i p_{ij}$.

In general, $p_{ij} \ne p_{i\cdot}\, p_{\cdot j}$. However, if $p_{ij} = p_{i\cdot}\, p_{\cdot j}$ for all $i$ and $j$, then $X$ and $Y$ are independent, i.e.
$P(X = x_i, Y = y_j) = P(X = x_i)\, P(Y = y_j)$ for all $i, j$.
For independent $X$ and $Y$, $\mathrm{Cov}(X,Y) = 0$.

Example 1. Flip a fair coin two times. Let $X = 1$ if H occurs in the first flip, and 0 if T occurs in the first flip. Let $Y = 1$ if the outcomes in the two flips are the same, and 0 if the two outcomes are different. The joint probability function is

    X \ Y |  0    1   |
      0   | 1/4  1/4  | 1/2
      1   | 1/4  1/4  | 1/2
          | 1/2  1/2  |

It is easy to see that $X$ and $Y$ are independent, which is a bit counter-intuitive.
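Example 1 is easy to confirm by simulation. A short Python sketch (the notes use R; Python here for illustration) estimates the four joint cell probabilities and checks the product rule:

```python
import random

# Simulate Example 1: two fair coin flips; X = 1 iff the first flip is H,
# Y = 1 iff the two flips agree.  All four cells should be near 1/4,
# and the joint should factor into the product of the marginals.
random.seed(1)
n = 200_000
counts = {(x, y): 0 for x in (0, 1) for y in (0, 1)}
for _ in range(n):
    f1, f2 = random.randint(0, 1), random.randint(0, 1)
    x = f1                        # 1 iff first flip is heads
    y = 1 if f1 == f2 else 0      # 1 iff the two outcomes are the same
    counts[(x, y)] += 1

joint = {cell: c / n for cell, c in counts.items()}
px1 = joint[(1, 0)] + joint[(1, 1)]   # marginal P(X = 1)
py1 = joint[(0, 1)] + joint[(1, 1)]   # marginal P(Y = 1)
```

Each cell frequency is close to $1/4 = (1/2)(1/2)$, consistent with independence.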

Continuous bivariate distributions

If the CDF $F_{X,Y}$ can be written as
$F_{X,Y}(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(u, v)\,du\,dv$
for any $x$ and $y$, where $f_{X,Y} \ge 0$, then $(X,Y)$ has a continuous joint distribution, and $f_{X,Y}(x,y)$ is the joint PDF. As $F_{X,Y}(\infty, \infty) = 1$, it holds that
$\int\!\!\int f_{X,Y}(u,v)\,du\,dv = 1$.
In fact any non-negative function satisfying this condition is a PDF. Furthermore, for any subset $A$ of $R^2$,
$P\{(X,Y) \in A\} = \int\!\!\int_A f_{X,Y}(x,y)\,dx\,dy$.

Also
$\mathrm{Cov}(X,Y) = \int\!\!\int (x - EX)(y - EY) f_{X,Y}(x,y)\,dx\,dy = \int\!\!\int xy\, f_{X,Y}(x,y)\,dx\,dy - EX \cdot EY$.

Note that
$F_X(x) = F_{X,Y}(x, \infty) = \int_{-\infty}^{\infty}\int_{-\infty}^{x} f_{X,Y}(u,v)\,du\,dv = \int_{-\infty}^{x} \big\{\int_{-\infty}^{\infty} f_{X,Y}(u,v)\,dv\big\}\,du$,
hence the marginal PDF of $X$ can be derived from the joint PDF as follows:
$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy$.
Similarly, $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$.
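Integrating one variable out of a joint PDF can be done numerically. A Python sketch with SciPy, using a hypothetical joint density $f(x,y) = x + y$ on the unit square (not an example from the notes), whose marginal is $f_X(x) = x + 1/2$:

```python
from scipy.integrate import quad

# Recover a marginal PDF numerically: f_X(x) = integral of f(x,y) over y.
# Hypothetical joint density f(x,y) = x + y on the unit square.
def f_joint(x, y):
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

def f_X(x):
    val, _ = quad(lambda y: f_joint(x, y), 0, 1)   # integrate y out
    return val
# analytically, f_X(x) = x + 1/2 for 0 <= x <= 1, and it integrates to 1
```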

Note. Different from discrete cases, it is not always easy to work out marginal PDFs from joint PDFs, especially when PDFs are discontinuous.

When $f_{X,Y}(x,y) = f_X(x) f_Y(y)$ for any $x$ and $y$, $X$ and $Y$ are independent, as then
$P(X \le x, Y \le y) = \int_{-\infty}^{y}\int_{-\infty}^{x} f_{X,Y}(u,v)\,du\,dv = \int_{-\infty}^{y}\int_{-\infty}^{x} f_X(u) f_Y(v)\,du\,dv = \int_{-\infty}^{x} f_X(u)\,du \int_{-\infty}^{y} f_Y(v)\,dv = P(X \le x)\,P(Y \le y)$,
and also $\mathrm{Cov}(X,Y) = 0$.

Example 2. Uniform distribution on the unit square, $U[0,1]^2$:
$f(x,y) = 1$ for $0 \le x \le 1$, $0 \le y \le 1$, and $0$ otherwise.

This is a well-defined PDF, as $f \ge 0$ and $\int\!\!\int f(x,y)\,dx\,dy = 1$. It is easy to see that $X$ and $Y$ are independent. Let us calculate some probabilities:
$P(X < 1/2, Y < 1/2) = F(1/2, 1/2) = \int_0^{1/2}\int_0^{1/2} dx\,dy = 1/4$,
$P(X + Y < 1) = \int\!\!\int_{\{x+y<1\}} f_{X,Y}(x,y)\,dx\,dy = \int\!\!\int_{\{x>0,\ y>0,\ x+y<1\}} dx\,dy = \int_0^1 dy \int_0^{1-y} dx = \int_0^1 (1-y)\,dy = 1/2$.

Example 3. Let $(X,Y)$ have the joint PDF
$f(x,y) = x^2 + xy/3$ for $0 \le x \le 1$, $0 \le y \le 2$, and $0$ otherwise.

Calculate $P(0 < X < 1/2,\ 1/4 < Y < 3)$ and $P(X < Y)$. Are $X$ and $Y$ independent of each other?

$P(0 < X < 1/2,\ 1/4 < Y < 3) = P(0 < X < 1/2,\ 1/4 < Y < 2) = \int_{1/4}^{2} dy \int_0^{1/2} \big(x^2 + \tfrac{xy}{3}\big)\,dx = \int_{1/4}^{2} \frac{1+y}{24}\,dy = \frac{1}{24}\Big[y + \frac{y^2}{2}\Big]_{1/4}^{2} = \frac{119}{768} \approx 0.155$.

$P(X < Y) = \int_0^1 dx \int_x^2 \big(x^2 + \tfrac{xy}{3}\big)\,dy = \int_0^1 \big(\tfrac{2}{3}x + 2x^2 - \tfrac{7}{6}x^3\big)\,dx = 17/24 \approx 0.708$.

The marginal PDFs are
$f_X(x) = \int_0^2 \big(x^2 + \tfrac{xy}{3}\big)\,dy = 2x^2 + \tfrac{2}{3}x$, and
$f_Y(y) = \int_0^1 \big(x^2 + \tfrac{xy}{3}\big)\,dx = \tfrac{1}{3} + \tfrac{y}{6}$.
Both $f_X(x)$ and $f_Y(y)$ are well-defined PDFs. But $f(x,y) \ne f_X(x) f_Y(y)$, hence $X$ and $Y$ are not independent.
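The two probabilities for Example 3 can be verified by numerical double integration. A Python sketch with SciPy (the notes use R; Python here for illustration):

```python
from scipy.integrate import dblquad

# Numerical check of the Example 3 probabilities for
# f(x,y) = x^2 + x*y/3 on 0 <= x <= 1, 0 <= y <= 2.
# dblquad integrates func(y, x): outer variable x, inner variable y.
f = lambda y, x: x**2 + x * y / 3

# P(0 < X < 1/2, 1/4 < Y < 2): x in (0, 1/2), y in (1/4, 2)
p1, _ = dblquad(f, 0, 0.5, 0.25, 2)
# P(X < Y): x in (0, 1), y in (x, 2)
p2, _ = dblquad(f, 0, 1, lambda x: x, 2)
```

Both values agree with the exact answers $119/768$ and $17/24$.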

4.2 Conditional Distributions

If $X$ and $Y$ are not independent, knowing $X$ should be helpful in determining $Y$, as $X$ may carry some information on $Y$. Therefore it makes sense to define the distribution of $Y$ given, say, $X = x$. This is the concept of conditional distributions.

If both $X$ and $Y$ are discrete, the conditional probability function is simply a special case of conditional probabilities:
$P(Y = y \mid X = x) = P(Y = y, X = x) / P(X = x)$.
However, this definition does not extend to continuous r.v.s, as then $P(X = x) = 0$.

Definition (Conditional PDF). For continuous r.v.s $X$ and $Y$, the conditional PDF of $Y$ given $X = x$ is
$f_{Y|X}(\cdot \mid x) = f_{X,Y}(x, \cdot)/f_X(x)$.

Remark. (i) As a function of $y$, $f_{Y|X}(y \mid x)$ is a PDF:
$P(Y \in A \mid X = x) = \int_A f_{Y|X}(y \mid x)\,dy$,
while $x$ is treated as a constant (i.e. not a variable).

(ii) $E(Y \mid X = x) = \int y f_{Y|X}(y \mid x)\,dy$ is a function of $x$, and
$\mathrm{Var}(Y \mid X = x) = \int \{y - E(Y \mid X = x)\}^2 f_{Y|X}(y \mid x)\,dy$.

(iii) If $X$ and $Y$ are independent, $f_{Y|X}(y \mid x) = f_Y(y)$.

(iv) $f_{X,Y}(x,y) = f_X(x) f_{Y|X}(y \mid x) = f_{X|Y}(x \mid y) f_Y(y)$, which offers alternative ways to determine the joint PDF.

(v) $E\{E(Y \mid X)\} = E(Y)$. This in fact holds for any r.v.s $X$ and $Y$. We give a proof here for continuous r.v.s only:
$E\{E(Y \mid X)\} = \int \big\{\int y f_{Y|X}(y \mid x)\,dy\big\} f_X(x)\,dx = \int\!\!\int y f_{X,Y}(x,y)\,dx\,dy = \int y \big\{\int f_{X,Y}(x,y)\,dx\big\}\,dy = \int y f_Y(y)\,dy = EY$.

Example 4. Let $f_{X,Y}(x,y) = e^{-y}$ for $0 < x < y < \infty$, and 0 otherwise. Find $f_{Y|X}(y \mid x)$, $f_{X|Y}(x \mid y)$ and $\mathrm{Cov}(X,Y)$.

We need to find $f_X(x)$ and $f_Y(y)$ first:
$f_X(x) = \int f_{X,Y}(x,y)\,dy = \int_x^{\infty} e^{-y}\,dy = e^{-x}$ for $x \in (0, \infty)$,
$f_Y(y) = \int f_{X,Y}(x,y)\,dx = \int_0^{y} e^{-y}\,dx = y e^{-y}$ for $y \in (0, \infty)$.
Hence
$f_{Y|X}(y \mid x) = f_{X,Y}(x,y)/f_X(x) = e^{-(y-x)}$ for $y \in (x, \infty)$,
$f_{X|Y}(x \mid y) = f_{X,Y}(x,y)/f_Y(y) = 1/y$ for $x \in (0, y)$.
Note that given $Y = y$, $X \sim U(0, y)$, i.e. the uniform distribution on $(0, y)$.

To find $\mathrm{Cov}(X,Y)$, we compute $EX$, $EY$ and $E(XY)$ first.
$EX = \int x f_X(x)\,dx = \int_0^{\infty} x e^{-x}\,dx = \big[-e^{-x}(1+x)\big]_0^{\infty} = 1$,
$EY = \int y f_Y(y)\,dy = \int_0^{\infty} y^2 e^{-y}\,dy = \big[-y^2 e^{-y}\big]_0^{\infty} + 2\int_0^{\infty} y e^{-y}\,dy = 2$,
$E(XY) = \int\!\!\int xy\, f_{X,Y}(x,y)\,dx\,dy = \int_0^{\infty} dy \int_0^{y} x y e^{-y}\,dx = \frac{1}{2}\int_0^{\infty} y^3 e^{-y}\,dy = \frac{1}{2}\big[-y^3 e^{-y}\big]_0^{\infty} + \frac{3}{2}\int_0^{\infty} y^2 e^{-y}\,dy = 3$.
Hence $\mathrm{Cov}(X,Y) = E(XY) - (EX)(EY) = 3 - 2 = 1$.
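Example 4 can also be checked by simulation, using remark (iv): draw $Y$ from its marginal ($f_Y(y) = y e^{-y}$ is the Gamma(2, 1) density) and then $X \mid Y = y \sim U(0, y)$. A Python sketch:

```python
import numpy as np

# Monte Carlo check of Example 4 via the factorisation
# f(x,y) = f_Y(y) * f_{X|Y}(x|y):  Y ~ Gamma(2,1),  X|Y=y ~ U(0,y).
# Expect EX = 1, EY = 2, Cov(X,Y) = 1.
rng = np.random.default_rng(42)
n = 1_000_000
y = rng.gamma(shape=2.0, scale=1.0, size=n)
x = rng.uniform(0.0, y)                       # uniform on (0, y), elementwise
cov = np.mean(x * y) - x.mean() * y.mean()    # E(XY) - EX*EY
```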

4.3 Multivariate Distributions

Let $\mathbf{X} = (X_1, \dots, X_n)$ be a random vector consisting of $n$ r.v.s. The joint CDF is defined as
$F(x_1, \dots, x_n) \equiv F_{X_1,\dots,X_n}(x_1, \dots, x_n) = P(X_1 \le x_1, \dots, X_n \le x_n)$.
If $\mathbf{X}$ is continuous, its PDF $f$ satisfies
$F(x_1, \dots, x_n) = \int_{-\infty}^{x_n} \cdots \int_{-\infty}^{x_1} f(u_1, \dots, u_n)\,du_1 \cdots du_n$.
In general, the PDF admits the factorisation
$f(x_1, \dots, x_n) = f(x_1)\, f(x_2 \mid x_1)\, f(x_3 \mid x_1, x_2) \cdots f(x_n \mid x_1, \dots, x_{n-1})$,
where $f(x_j \mid x_1, \dots, x_{j-1})$ denotes the conditional PDF of $X_j$ given $X_1 = x_1, \dots, X_{j-1} = x_{j-1}$.

However, when $X_1, \dots, X_n$ are independent,
$f_{X_1,\dots,X_n}(x_1, \dots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$.

IID Samples. If $X_1, \dots, X_n$ are independent and each has the same CDF $F$, we say that $X_1, \dots, X_n$ are IID (independent and identically distributed) and write $X_1, \dots, X_n \sim_{iid} F$. We also call $X_1, \dots, X_n$ a sample or a random sample.

Two important multivariate distributions

Multinomial Distribution $\mathrm{Multinomial}(n, p_1, \dots, p_k)$: an extension of $\mathrm{Bin}(n, p)$.

Suppose we throw a $k$-sided die $n$ times, and record $X_i$ as the number of throws that ended with the $i$-th side, $i = 1, \dots, k$. Then $(X_1, \dots, X_k) \sim \mathrm{Multinomial}(n, p_1, \dots, p_k)$, where $p_i$ is the probability that the $i$-th side occurs in one throw. Obviously $p_i \ge 0$ and $\sum_i p_i = 1$.

We may immediately make the following observations from the above definition.
(i) $X_1 + \cdots + X_k \equiv n$, therefore $X_1, \dots, X_k$ are not independent.
(ii) $X_i \sim \mathrm{Bin}(n, p_i)$, hence $EX_i = np_i$ and $\mathrm{Var}(X_i) = np_i(1-p_i)$.

The joint probability function of $\mathrm{Multinomial}(n, p_1, \dots, p_k)$: for any $j_1, \dots, j_k \ge 0$ with $j_1 + \cdots + j_k = n$,
$P(X_1 = j_1, \dots, X_k = j_k) = \dfrac{n!}{j_1! \cdots j_k!}\, p_1^{j_1} \cdots p_k^{j_k}$.
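The probability function above is straightforward to evaluate directly. A Python sketch (the notes use R; the function name here is ours):

```python
from math import factorial

# Evaluate the Multinomial(n, p_1,...,p_k) probability function
# P(X_1=j_1,...,X_k=j_k) = n!/(j_1!...j_k!) * p_1^j_1 * ... * p_k^j_k.
def multinomial_pmf(counts, probs):
    n = sum(counts)
    coef = factorial(n)
    for j in counts:
        coef //= factorial(j)       # multinomial coefficient, exact integer
    p = 1.0
    for j, pj in zip(counts, probs):
        p *= pj ** j
    return coef * p

# e.g. a fair 3-sided die thrown 4 times, landing (2, 1, 1):
# 4!/(2! 1! 1!) * (1/3)^4 = 12/81
```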

Multivariate Normal Distribution $N(\mu, \Sigma)$: a $k$-variate r.v. $\mathbf{X} = (X_1, \dots, X_k)'$ is normal with mean $\mu$ and covariance matrix $\Sigma$ if its PDF is
$f(\mathbf{x}) = \dfrac{1}{(2\pi)^{k/2} |\Sigma|^{1/2}} \exp\big\{-\tfrac{1}{2}(\mathbf{x}-\mu)'\Sigma^{-1}(\mathbf{x}-\mu)\big\}$, $\mathbf{x} \in R^k$,
where $\mu$ is a $k$-vector, and $\Sigma$ is a $k \times k$ positive-definite matrix.

Some properties of $N(\mu, \Sigma)$: let $\mu = (\mu_1, \dots, \mu_k)'$ and $\Sigma \equiv (\sigma_{ij})$, then
(i) $E\mathbf{X} = \mu$, and the covariance matrix $\mathrm{Cov}(\mathbf{X}) = E\{(\mathbf{X}-\mu)(\mathbf{X}-\mu)'\} = \Sigma$, with $\sigma_{ij} = \mathrm{Cov}(X_i, X_j) = E\{(X_i - \mu_i)(X_j - \mu_j)\}$.
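The density formula can be coded directly from the definition. A Python sketch (checked against SciPy's implementation at an arbitrary test point; the notes themselves use R):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Evaluate the N(mu, Sigma) density straight from the formula.
def mvn_pdf(x, mu, Sigma):
    x, mu = np.asarray(x, float), np.asarray(mu, float)
    Sigma = np.asarray(Sigma, float)
    k = len(mu)
    diff = x - mu
    quad_form = diff @ np.linalg.solve(Sigma, diff)   # (x-mu)' Sigma^{-1} (x-mu)
    norm_const = (2 * np.pi) ** (k / 2) * np.sqrt(np.linalg.det(Sigma))
    return np.exp(-0.5 * quad_form) / norm_const

mu = [0.0, 0.0]
Sigma = [[1.0, 0.5], [0.5, 1.0]]
p_formula = mvn_pdf([0.3, -0.2], mu, Sigma)
p_scipy = multivariate_normal(mean=mu, cov=Sigma).pdf([0.3, -0.2])
```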

(ii) When $\sigma_{ij} = 0$ for all $i \ne j$, i.e. the components of $\mathbf{X}$ are uncorrelated, $\Sigma = \mathrm{diag}(\sigma_{11}, \dots, \sigma_{kk})$ and $|\Sigma| = \prod_i \sigma_{ii}$. Hence the PDF admits a simple form
$f(\mathbf{x}) = \prod_{i=1}^{k} \dfrac{1}{\sqrt{2\pi\sigma_{ii}}} \exp\Big\{-\dfrac{(x_i - \mu_i)^2}{2\sigma_{ii}}\Big\}$.
Thus $X_1, \dots, X_k$ are independent when $\sigma_{ij} = 0$ for all $i \ne j$.

(iii) Let $\mathbf{Y} = A\mathbf{X} + \mathbf{b}$, where $A$ is a constant matrix and $\mathbf{b}$ is a constant vector. Then $\mathbf{Y} \sim N(A\mu + \mathbf{b},\ A\Sigma A')$.

(iv) $X_i \sim N(\mu_i, \sigma_{ii})$. For any constant $k$-vector $\mathbf{a}$, $\mathbf{a}'\mathbf{X}$ is a scalar r.v. and $\mathbf{a}'\mathbf{X} \sim N(\mathbf{a}'\mu,\ \mathbf{a}'\Sigma\mathbf{a})$.

(v) Standard Normal Distribution: $N(\mathbf{0}, I_k)$, where $I_k$ is the $k \times k$ identity matrix.

Example 5. Let $X_1, X_2, X_3$ be jointly normal with common mean 0, variance 1 and $\mathrm{Corr}(X_i, X_j) = 0.5$ for $1 \le i \ne j \le 3$. Find the probability $P(|X_1| + |X_2| + |X_3| \le 2)$.

It is difficult to calculate this probability by integrating the joint PDF. We provide an estimate by simulation. (The justification will be given in Chapter 6.)

We solve a general problem first. Let $\mathbf{X} \sim N(\mu, \Sigma)$, where $\mathbf{X}$ has $p$ components. For any set $A \subset R^p$, we may estimate the probability $P(\mathbf{X} \in A)$ by the relative frequency
$\#\{1 \le i \le n : \mathbf{X}_i \in A\} / n$,
where $n$ is a large integer, and $\mathbf{X}_1, \dots, \mathbf{X}_n$ are $n$ vectors generated independently from $N(\mu, \Sigma)$.

Note $\mathbf{X} = \mu + \Sigma^{1/2}\mathbf{Z}$, where $\mathbf{Z} \sim N(\mathbf{0}, I_p)$ is standard normal, and $\Sigma^{1/2} \ge 0$ with $\Sigma^{1/2}\Sigma^{1/2} = \Sigma$. We generate $\mathbf{Z}$ by rnorm(p), and apply the above linear transformation to obtain $\mathbf{X}$.

$\Sigma^{1/2}$ may be obtained by an eigenanalysis of $\Sigma$ using the R function eigen. Since $\Sigma \ge 0$, it holds that $\Sigma = \Gamma\Lambda\Gamma'$, where $\Gamma$ is an orthogonal matrix (i.e. $\Gamma'\Gamma = I_p$) and $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_p)$ is a diagonal matrix. Then $\Sigma^{1/2} = \Gamma\Lambda^{1/2}\Gamma'$, where $\Lambda^{1/2} = \mathrm{diag}(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_p})$.

The R function rmnorm below generates random vectors from $N(\mu, \Sigma)$.

    rmnorm <- function(n, p, mu, Sigma) {
      # generate n p-vectors from N(mu, Sigma)
      # mu is a p-vector of means; Sigma >= 0 is a p x p matrix
      t <- eigen(Sigma, symmetric=T)        # eigenanalysis of Sigma
      ev <- sqrt(t$values)                  # square roots of the eigenvalues
      G <- as.matrix(t$vectors)             # line up eigenvectors into a matrix G
      D <- G*0; for(i in 1:p) D[i,i] <- ev[i]  # D is the diagonal matrix Lambda^{1/2}
      P <- G %*% D %*% t(G)                 # P = G D G' is the required transformation matrix
      Z <- matrix(rnorm(n*p), byrow=T, ncol=p)  # Z is n x p with elements drawn from N(0,1)
      Z <- Z %*% P                          # now each row of Z is N(0, Sigma)
      X <- matrix(rep(mu, n), byrow=T, ncol=p) + Z  # each row of X is N(mu, Sigma)
      X
    }

This function is saved in the file rmnorm.r. We may use it to perform the required task:

    source("rmnorm.r")
    mu <- c(0, 0, 0)
    Sigma <- matrix(c(1,0.5,0.5, 0.5,1,0.5, 0.5,0.5,1), byrow=T, ncol=3)
    X <- rmnorm(20000, 3, mu, Sigma)
    dim(X)                                  # check the size of X
    t <- abs(X[,1]) + abs(X[,2]) + abs(X[,3])
    cat("Estimated probability:", length(t[t<=2])/20000, "\n")

It returned the value:
Estimated probability: 0.446
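For comparison, the same Monte Carlo estimate can be sketched in Python with numpy's built-in multivariate normal generator (this is an independent re-implementation, not part of the original notes):

```python
import numpy as np

# Estimate P(|X1| + |X2| + |X3| <= 2) for the equicorrelated normal
# of Example 5 (mean 0, variance 1, all pairwise correlations 0.5).
rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.5, 0.5],
                  [0.5, 1.0, 0.5],
                  [0.5, 0.5, 1.0]])
X = rng.multivariate_normal(mean=np.zeros(3), cov=Sigma, size=200_000)
est = np.mean(np.abs(X).sum(axis=1) <= 2)   # relative frequency of the event
```

The estimate lands near the 0.44 to 0.45 range reported above.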

I repeated it a few more times and obtained the estimates 0.439, 0.445, etc.

4.4 Transformations of random variables

Let a random vector $\mathbf{X}$ have PDF $f_{\mathbf{X}}$. We are interested in the distribution of a scalar function of $\mathbf{X}$, say, $Y = r(\mathbf{X})$. We introduce a general procedure first.

Three steps to find the PDF of $Y = r(\mathbf{X})$:
(i) For each $y$, find the set $A_y = \{\mathbf{x} : r(\mathbf{x}) \le y\}$.
(ii) Find the CDF $F_Y(y) = P(Y \le y) = P\{r(\mathbf{X}) \le y\} = \int_{A_y} f_{\mathbf{X}}(\mathbf{x})\,d\mathbf{x}$.
(iii) $f_Y(y) = \frac{d}{dy} F_Y(y)$.

Example 6. Let $X \sim f_X(x)$ ($X$ is a scalar). Find the PDF of $Y = e^X$.

$A_y = \{x : e^x \le y\} = \{x : x \le \log y\}$. Hence
$F_Y(y) = P(Y \le y) = P\{e^X \le y\} = P(X \le \log y) = F_X(\log y)$.
Hence
$f_Y(y) = \frac{d}{dy} F_X(\log y) = f_X(\log y)\,\frac{d \log y}{dy} = y^{-1} f_X(\log y)$.

Note that $y = e^x$ and $\log y = x$, so $\frac{dy}{dx} = e^x = y$. The above result can be written as
$f_Y(y) = f_X(x) \big/ \big|\tfrac{dy}{dx}\big|$, or $f_Y(y)\,|dy| = f_X(x)\,|dx|$.

For a 1-1 transformation $Y = r(X)$ (i.e. the inverse function $X = r^{-1}(Y)$ is uniquely defined), it holds that
$f_Y(y) = f_X(x)\big/|r'(x)| = f_X(x)\,\big|\tfrac{dx}{dy}\big|$.
Note. You should replace all $x$ in the above by $x = r^{-1}(y)$.
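The formula of Example 6 is easy to check numerically: with $X \sim N(0,1)$, $Y = e^X$ is standard lognormal, so $f_Y(y) = f_X(\log y)/y$ should match the lognormal density. A Python sketch with SciPy:

```python
import numpy as np
from scipy.stats import norm, lognorm

# Check f_Y(y) = f_X(log y) / y from Example 6, taking X ~ N(0,1)
# so that Y = e^X is standard lognormal.
y = np.array([0.5, 1.0, 2.0, 3.0])
f_Y = norm.pdf(np.log(y)) / y          # the transformation formula
reference = lognorm.pdf(y, s=1.0)      # scipy's lognormal with sigma = 1
```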

Example 7. Let $X \sim U(-1, 3)$. Find the PDF of $Y = X^2$.

Now this is not a 1-1 transformation. We have to use the general 3-step procedure. Note that $Y$ takes values in $(0, 9)$. Consider two cases:
(i) For $y \in (0, 1)$, $A_y = (-\sqrt{y}, \sqrt{y})$, so $F_Y(y) = \int_{A_y} f_X(x)\,dx = 0.5\sqrt{y}$. Hence $f_Y(y) = F_Y'(y) = 0.25/\sqrt{y}$.
(ii) For $y \in [1, 9)$, $A_y = (-1, \sqrt{y})$, so $F_Y(y) = \int_{A_y} f_X(x)\,dx = 0.25(\sqrt{y} + 1)$. Hence $f_Y(y) = F_Y'(y) = 0.125/\sqrt{y}$.
Collectively we have
$f_Y(y) = 0.25/\sqrt{y}$ for $0 < y < 1$; $\quad 0.125/\sqrt{y}$ for $1 \le y < 9$; $\quad 0$ otherwise.
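The piecewise answer of Example 7 can be validated by simulation, comparing the empirical CDF of $Y = X^2$ against the CDF implied by the derived density. A Python sketch:

```python
import numpy as np

# Simulation check of Example 7: X ~ U(-1, 3), Y = X^2.
rng = np.random.default_rng(7)
ysim = rng.uniform(-1, 3, size=500_000) ** 2

def F_Y(y):
    # CDF obtained by integrating the derived piecewise density:
    # 0.5*sqrt(y) on (0,1), then 0.25*(sqrt(y)+1) on [1,9)
    if y < 1:
        return 0.5 * np.sqrt(y)
    return 0.25 * (np.sqrt(y) + 1)
```

The empirical frequencies $P(Y \le 0.5)$ and $P(Y \le 4)$ agree with $F_Y(0.5) \approx 0.354$ and $F_Y(4) = 0.75$.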


More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

Basics on Probability. Jingrui He 09/11/2007

Basics on Probability. Jingrui He 09/11/2007 Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete

More information

Formulas for probability theory and linear models SF2941

Formulas for probability theory and linear models SF2941 Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms

More information

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc.

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc. ECE32 Exam 2 Version A April 21, 214 1 Name: Solution Score: /1 This exam is closed-book. You must show ALL of your work for full credit. Please read the questions carefully. Please check your answers

More information

Lecture 14: Multivariate mgf s and chf s

Lecture 14: Multivariate mgf s and chf s Lecture 14: Multivariate mgf s and chf s Multivariate mgf and chf For an n-dimensional random vector X, its mgf is defined as M X (t) = E(e t X ), t R n and its chf is defined as φ X (t) = E(e ıt X ),

More information

Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2

Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 You can t see this text! Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 Eric Zivot Spring 2015 Eric Zivot (Copyright 2015) Probability Review - Part 2 1 /

More information

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1

ECE 650 Lecture 4. Intro to Estimation Theory Random Vectors. ECE 650 D. Van Alphen 1 EE 650 Lecture 4 Intro to Estimation Theory Random Vectors EE 650 D. Van Alphen 1 Lecture Overview: Random Variables & Estimation Theory Functions of RV s (5.9) Introduction to Estimation Theory MMSE Estimation

More information

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix: Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,

More information

Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology

Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology Review (probability, linear algebra) CE-717 : Machine Learning Sharif University of Technology M. Soleymani Fall 2012 Some slides have been adopted from Prof. H.R. Rabiee s and also Prof. R. Gutierrez-Osuna

More information

L2: Review of probability and statistics

L2: Review of probability and statistics Probability L2: Review of probability and statistics Definition of probability Axioms and properties Conditional probability Bayes theorem Random variables Definition of a random variable Cumulative distribution

More information

Homework 10 (due December 2, 2009)

Homework 10 (due December 2, 2009) Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n

More information

4 Pairs of Random Variables

4 Pairs of Random Variables B.Sc./Cert./M.Sc. Qualif. - Statistical Theory 4 Pairs of Random Variables 4.1 Introduction In this section, we consider a pair of r.v. s X, Y on (Ω, F, P), i.e. X, Y : Ω R. More precisely, we define a

More information

01 Probability Theory and Statistics Review

01 Probability Theory and Statistics Review NAVARCH/EECS 568, ROB 530 - Winter 2018 01 Probability Theory and Statistics Review Maani Ghaffari January 08, 2018 Last Time: Bayes Filters Given: Stream of observations z 1:t and action data u 1:t Sensor/measurement

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Chapter 2: Random Variables

Chapter 2: Random Variables ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:

More information

ECE531: Principles of Detection and Estimation Course Introduction

ECE531: Principles of Detection and Estimation Course Introduction ECE531: Principles of Detection and Estimation Course Introduction D. Richard Brown III WPI 22-January-2009 WPI D. Richard Brown III 22-January-2009 1 / 37 Lecture 1 Major Topics 1. Web page. 2. Syllabus

More information

ECE Homework Set 3

ECE Homework Set 3 ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else ECE 450 Homework #3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3 4 5

More information

Lecture 11. Multivariate Normal theory

Lecture 11. Multivariate Normal theory 10. Lecture 11. Multivariate Normal theory Lecture 11. Multivariate Normal theory 1 (1 1) 11. Multivariate Normal theory 11.1. Properties of means and covariances of vectors Properties of means and covariances

More information

Lecture 15: Multivariate normal distributions

Lecture 15: Multivariate normal distributions Lecture 15: Multivariate normal distributions Normal distributions with singular covariance matrices Consider an n-dimensional X N(µ,Σ) with a positive definite Σ and a fixed k n matrix A that is not of

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x

More information

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5 Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and

More information

STAT 516 Midterm Exam 3 Friday, April 18, 2008

STAT 516 Midterm Exam 3 Friday, April 18, 2008 STAT 56 Midterm Exam 3 Friday, April 8, 2008 Name Purdue student ID (0 digits). The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

Multivariate Distributions

Multivariate Distributions Copyright Cosma Rohilla Shalizi; do not distribute without permission updates at http://www.stat.cmu.edu/~cshalizi/adafaepov/ Appendix E Multivariate Distributions E.1 Review of Definitions Let s review

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

Notes on Random Vectors and Multivariate Normal

Notes on Random Vectors and Multivariate Normal MATH 590 Spring 06 Notes on Random Vectors and Multivariate Normal Properties of Random Vectors If X,, X n are random variables, then X = X,, X n ) is a random vector, with the cumulative distribution

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

Quick Tour of Basic Probability Theory and Linear Algebra

Quick Tour of Basic Probability Theory and Linear Algebra Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra CS224w: Social and Information Network Analysis Fall 2011 Quick Tour of and Linear Algebra Quick Tour of and Linear Algebra Outline Definitions

More information

4. Distributions of Functions of Random Variables

4. Distributions of Functions of Random Variables 4. Distributions of Functions of Random Variables Setup: Consider as given the joint distribution of X 1,..., X n (i.e. consider as given f X1,...,X n and F X1,...,X n ) Consider k functions g 1 : R n

More information