4. Distributions of Functions of Random Variables

Setup:
Consider as given the joint distribution of $X_1, \ldots, X_n$ (i.e. consider as given $f_{X_1,\ldots,X_n}$ and $F_{X_1,\ldots,X_n}$).
Consider $k$ functions $g_1: \mathbb{R}^n \to \mathbb{R}, \ldots, g_k: \mathbb{R}^n \to \mathbb{R}$.
Find the joint distribution of the $k$ random variables $Y_1 = g_1(X_1, \ldots, X_n), \ldots, Y_k = g_k(X_1, \ldots, X_n)$ (i.e. find $f_{Y_1,\ldots,Y_k}$ and $F_{Y_1,\ldots,Y_k}$).
155

Example:
Consider as given $X_1, \ldots, X_n$ with $f_{X_1,\ldots,X_n}$.
Consider the functions $g_1(X_1, \ldots, X_n) = \sum_{i=1}^n X_i$ and $g_2(X_1, \ldots, X_n) = \frac{1}{n} \sum_{i=1}^n X_i$.
Find $f_{Y_1,Y_2}$ with $Y_1 = \sum_{i=1}^n X_i$ and $Y_2 = \frac{1}{n} \sum_{i=1}^n X_i$.
Remark:
From the joint distribution $f_{Y_1,\ldots,Y_k}$ we can derive the $k$ marginal distributions $f_{Y_1}, \ldots, f_{Y_k}$ (cf. Chapter 3, Slides 106, 107).
156

Aim of this chapter:
Techniques for finding the (marginal) distribution(s) of $(Y_1, \ldots, Y_k)$.
157

4.1 Expectations of Functions of Random Variables
Simplification:
In a first step, we are not interested in the exact distributions, but merely in certain expected values of $Y_1, \ldots, Y_k$.
Expectation two ways:
Consider as given the (continuous) random variables $X_1, \ldots, X_n$ and the function $g: \mathbb{R}^n \to \mathbb{R}$.
Consider the random variable $Y = g(X_1, \ldots, X_n)$ and find the expectation $E[g(X_1, \ldots, X_n)]$.
158

Two ways of calculating $E(Y)$:
$$E(Y) = \int_{-\infty}^{+\infty} y \cdot f_Y(y) \, dy$$
or
$$E(Y) = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} g(x_1, \ldots, x_n) \cdot f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) \, dx_1 \ldots dx_n$$
(cf. Definition 3.9, Slide 128)
It can be proved that both ways of calculating $E(Y)$ are equivalent $\Longrightarrow$ choose the most convenient calculation.
159
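
Numerical check: the following sketch verifies the equivalence for the illustrative choice $X \sim \text{Uniform}(0,1)$ and $Y = g(X) = X^2$, in which case $f_Y(y) = 1/(2\sqrt{y})$ on $(0,1)$; these concrete choices are not taken from the slides.

```python
# Both routes to E(Y) for the illustrative case X ~ Uniform(0, 1),
# Y = g(X) = X^2, with f_Y(y) = 1/(2*sqrt(y)) on (0, 1).
import numpy as np
from scipy import integrate

g = lambda x: x**2                          # the transformation
f_X = lambda x: 1.0                         # pdf of X on [0, 1]
f_Y = lambda y: 1.0 / (2.0 * np.sqrt(y))    # pdf of Y = X^2 on (0, 1)

e_via_fY, _ = integrate.quad(lambda y: y * f_Y(y), 0.0, 1.0)
e_via_fX, _ = integrate.quad(lambda x: g(x) * f_X(x), 0.0, 1.0)
print(e_via_fY, e_via_fX)                   # both print 1/3
```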

Now:
Calculation rules for expected values, variances, and covariances of sums of random variables.
Setting:
$X_1, \ldots, X_n$ are given continuous or discrete random variables with joint density $f_{X_1,\ldots,X_n}$.
The (transforming) function $g: \mathbb{R}^n \to \mathbb{R}$ is given by $g(x_1, \ldots, x_n) = \sum_{i=1}^n x_i$.
160

In a first step, find the expectation and the variance of $Y = g(X_1, \ldots, X_n) = \sum_{i=1}^n X_i$.
Theorem 4.1: (Expectation and variance of a sum)
For the given random variables $X_1, \ldots, X_n$ we have
$$E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i)$$
and
$$\mathrm{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \mathrm{Var}(X_i) + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \mathrm{Cov}(X_i, X_j).$$
161
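
Numerical check: a minimal Monte Carlo sketch of Theorem 4.1 on three correlated normal random variables; the mean vector and covariance matrix below are arbitrary illustrative choices, not from the slides.

```python
# Monte Carlo check of Theorem 4.1; mean and covariance are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
cov = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.5]])
X = rng.multivariate_normal(mu, cov, size=500_000)
S = X.sum(axis=1)                      # the sum of the X_i

print(S.mean(), mu.sum())              # E(sum) = sum of E(X_i) = -0.5
var_thm = cov.trace() + 2.0 * (cov[0, 1] + cov[0, 2] + cov[1, 2])
print(S.var(), var_thm)                # Var(sum) via Theorem 4.1 = 6.9
```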

Implications:
For given constants $a_1, \ldots, a_n \in \mathbb{R}$ we have
$$E\left(\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n a_i E(X_i)$$
(why?)
For two random variables $X_1$ and $X_2$ we have $E(X_1 \pm X_2) = E(X_1) \pm E(X_2)$.
If $X_1, \ldots, X_n$ are stochastically independent, it follows that $\mathrm{Cov}(X_i, X_j) = 0$ for all $i \neq j$ and hence
$$\mathrm{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \mathrm{Var}(X_i)$$
162

Now:
Calculating the covariance of two sums of random variables.
Theorem 4.2: (Covariance of two sums)
Let $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_m$ be two sets of random variables and let $a_1, \ldots, a_n$ and $b_1, \ldots, b_m$ be two sets of constants. Then
$$\mathrm{Cov}\left(\sum_{i=1}^n a_i X_i, \sum_{j=1}^m b_j Y_j\right) = \sum_{i=1}^n \sum_{j=1}^m a_i b_j \, \mathrm{Cov}(X_i, Y_j).$$
163

Implications:
The variance of a weighted sum of random variables is given by
$$\mathrm{Var}\left(\sum_{i=1}^n a_i X_i\right) = \mathrm{Cov}\left(\sum_{i=1}^n a_i X_i, \sum_{j=1}^n a_j X_j\right) = \sum_{i=1}^n \sum_{j=1}^n a_i a_j \, \mathrm{Cov}(X_i, X_j)$$
$$= \sum_{i=1}^n a_i^2 \, \mathrm{Var}(X_i) + \sum_{i=1}^n \sum_{j=1, j \neq i}^n a_i a_j \, \mathrm{Cov}(X_i, X_j)$$
$$= \sum_{i=1}^n a_i^2 \, \mathrm{Var}(X_i) + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^n a_i a_j \, \mathrm{Cov}(X_i, X_j)$$
164
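
Numerical check: in matrix form the weighted-sum variance is the quadratic form $a' \Sigma a$ with $\Sigma$ the covariance matrix of $(X_1, \ldots, X_n)$; a minimal simulation sketch, with arbitrary illustrative weights and covariance matrix.

```python
# Var(sum a_i X_i) vs. the quadratic form a' Sigma a; numbers illustrative.
import numpy as np

rng = np.random.default_rng(1)
cov = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.5]])
X = rng.multivariate_normal(np.zeros(3), cov, size=500_000)
a = np.array([1.0, -2.0, 0.5])

Y = X @ a                               # the weighted sum
print(Y.var(), a @ cov @ a)             # Monte Carlo vs. exact value
```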

For two random variables $X_1$ and $X_2$ we have
$$\mathrm{Var}(X_1 \pm X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) \pm 2 \, \mathrm{Cov}(X_1, X_2),$$
and if $X_1$ and $X_2$ are independent we have
$$\mathrm{Var}(X_1 \pm X_2) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2)$$
Finally:
Important result concerning the expectation of a product of two random variables.
165

Setting:
Let $X_1, X_2$ be both continuous or both discrete random variables with joint density $f_{X_1,X_2}$.
Let $g: \mathbb{R}^2 \to \mathbb{R}$ be defined as $g(x_1, x_2) = x_1 \cdot x_2$.
Find the expectation of $Y = g(X_1, X_2) = X_1 \cdot X_2$.
Theorem 4.3: (Expectation of a product)
For the random variables $X_1, X_2$ we have
$$E(X_1 \cdot X_2) = E(X_1) \cdot E(X_2) + \mathrm{Cov}(X_1, X_2).$$
166
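
Numerical check: a simulation sketch of Theorem 4.3 on a bivariate normal test case; the parameters are arbitrary illustrative choices.

```python
# Check of E(X1*X2) = E(X1)*E(X2) + Cov(X1, X2); parameters illustrative.
import numpy as np

rng = np.random.default_rng(2)
X = rng.multivariate_normal([1.0, 2.0], [[1.0, 0.6], [0.6, 2.0]],
                            size=1_000_000)
X1, X2 = X[:, 0], X[:, 1]

lhs = (X1 * X2).mean()
rhs = X1.mean() * X2.mean() + np.cov(X1, X2)[0, 1]
print(lhs, rhs)   # both close to 1*2 + 0.6 = 2.6
```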

Implication:
If $X_1$ and $X_2$ are stochastically independent, we have $E(X_1 \cdot X_2) = E(X_1) \cdot E(X_2)$.
Remarks:
A formula for $\mathrm{Var}(X_1 \cdot X_2)$ also exists.
In many cases, there are no explicit formulas for expected values and variances of other transformations (e.g. for ratios of random variables).
167

4.2 The Cumulative-distribution-function Technique
Motivation:
Consider as given the random variables $X_1, \ldots, X_n$ with joint density $f_{X_1,\ldots,X_n}$.
Find the joint distribution of $Y_1, \ldots, Y_k$ where $Y_j = g_j(X_1, \ldots, X_n)$ for $j = 1, \ldots, k$.
The joint cdf of $Y_1, \ldots, Y_k$ is defined to be
$$F_{Y_1,\ldots,Y_k}(y_1, \ldots, y_k) = P(Y_1 \leq y_1, \ldots, Y_k \leq y_k)$$
(cf. Definition 3.2, Slide 98)
168

Now, for each $y_1, \ldots, y_k$ we have the event identity
$$\{Y_1 \leq y_1, \ldots, Y_k \leq y_k\} = \{g_1(X_1, \ldots, X_n) \leq y_1, \ldots, g_k(X_1, \ldots, X_n) \leq y_k\},$$
i.e. the latter event is described in terms of the given functions $g_1, \ldots, g_k$ and the given random variables $X_1, \ldots, X_n$. Since the joint distribution of $X_1, \ldots, X_n$ is assumed given, the probability of the latter event can (presumably) be calculated and consequently $F_{Y_1,\ldots,Y_k}$ determined.
169

Example 1:
Consider $n = 1$ (i.e. consider $X_1 \equiv X$ with cdf $F_X$) and $k = 1$ (i.e. $g_1 \equiv g$ and $Y_1 \equiv Y$).
Consider the function $g(x) = a \cdot x + b$, $b \in \mathbb{R}$, $a > 0$.
Find the distribution of $Y = g(X) = a \cdot X + b$.
170

The cdf of $Y$ is given by
$$F_Y(y) = P(Y \leq y) = P[g(X) \leq y] = P(a \cdot X + b \leq y) = P\left(X \leq \frac{y - b}{a}\right) = F_X\left(\frac{y - b}{a}\right)$$
If $X$ is continuous, the pdf of $Y$ is given by
$$f_Y(y) = F_Y'(y) = \frac{d}{dy} F_X\left(\frac{y - b}{a}\right) = \frac{1}{a} \, f_X\left(\frac{y - b}{a}\right)$$
(cf. Slide 48)
171
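
Numerical check: for the illustrative case $X \sim N(0, 1)$, $a = 2$, $b = 3$ (choices not from the slide), the derived pdf $\frac{1}{a} f_X\big(\frac{y-b}{a}\big)$ must coincide with the $N(b, a^2)$ density, which gives a direct cross-check.

```python
# Linear-transformation density for the illustrative case X ~ N(0, 1):
# (1/a) f_X((y - b)/a) must equal the N(b, a^2) pdf.
import numpy as np
from scipy import stats

a, b = 2.0, 3.0
grid = np.linspace(-5.0, 11.0, 50)
derived = stats.norm.pdf((grid - b) / a) / a       # cdf-technique result
reference = stats.norm.pdf(grid, loc=b, scale=a)   # N(3, 4) density
print(np.allclose(derived, reference))             # True
```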

Example 2:
Consider $n = 1$ and $k = 1$ and the function $g(x) = e^x$.
The cdf of $Y = g(X) = e^X$ is given by
$$F_Y(y) = P(Y \leq y) = P(e^X \leq y) = P[X \leq \ln(y)] = F_X[\ln(y)]$$
If $X$ is continuous, the pdf of $Y$ is given by
$$f_Y(y) = F_Y'(y) = \frac{d}{dy} F_X[\ln(y)] = \frac{f_X[\ln(y)]}{y}$$
172
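
Numerical check: for the illustrative special case $X \sim N(0, 1)$ (not fixed on the slide), the derived density $f_X[\ln(y)]/y$ is exactly the lognormal pdf, which scipy ships, so the result can be cross-checked directly.

```python
# Example 2 for the illustrative case X ~ N(0, 1): the derived pdf
# f_X(ln y) / y must equal scipy's lognormal density with sigma = 1.
import numpy as np
from scipy import stats

grid = np.linspace(0.1, 5.0, 50)
derived = stats.norm.pdf(np.log(grid)) / grid   # cdf-technique result
reference = stats.lognorm.pdf(grid, s=1.0)      # lognormal, sigma = 1
print(np.allclose(derived, reference))          # True
```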

Now:
Consider $n = 2$ and $k = 2$, i.e. consider $X_1$ and $X_2$ with joint density $f_{X_1,X_2}(x_1, x_2)$.
Consider the functions $g_1(x_1, x_2) = x_1 + x_2$ and $g_2(x_1, x_2) = x_1 - x_2$.
Find the distributions of the sum and the difference of two random variables.
Derivation via the two-dimensional cdf-technique.
173

Theorem 4.4: (Distribution of a sum / difference)
Let $X_1$ and $X_2$ be two continuous random variables with joint pdf $f_{X_1,X_2}(x_1, x_2)$. Then the pdfs of $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$ are given by
$$f_{Y_1}(y_1) = \int_{-\infty}^{+\infty} f_{X_1,X_2}(x_1, y_1 - x_1) \, dx_1 = \int_{-\infty}^{+\infty} f_{X_1,X_2}(y_1 - x_2, x_2) \, dx_2$$
and
$$f_{Y_2}(y_2) = \int_{-\infty}^{+\infty} f_{X_1,X_2}(x_1, x_1 - y_2) \, dx_1 = \int_{-\infty}^{+\infty} f_{X_1,X_2}(y_2 + x_2, x_2) \, dx_2$$
174

Implication:
If $X_1$ and $X_2$ are independent, then
$$f_{Y_1}(y_1) = \int_{-\infty}^{+\infty} f_{X_1}(x_1) \cdot f_{X_2}(y_1 - x_1) \, dx_1$$
$$f_{Y_2}(y_2) = \int_{-\infty}^{+\infty} f_{X_1}(x_1) \cdot f_{X_2}(x_1 - y_2) \, dx_1$$
Example:
Let $X_1$ and $X_2$ be independent random variables both with pdf
$$f_{X_1}(x) = f_{X_2}(x) = \begin{cases} 1, & \text{for } x \in [0, 1] \\ 0, & \text{elsewise} \end{cases}$$
Find the pdf of $Y = X_1 + X_2$. (Class)
175
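
Numerical preview: the class example can be sketched by evaluating the convolution integral on a grid; this only previews the shape of $f_Y$, the closed-form density is derived in class.

```python
# Convolution integral for the class example (Y = X1 + X2 with
# independent Uniform[0, 1] variables), evaluated numerically.
import numpy as np
from scipy import integrate

f = lambda x: np.where((x >= 0.0) & (x <= 1.0), 1.0, 0.0)  # uniform pdf

def f_Y(y):
    # integrand f(x) * f(y - x) vanishes outside [max(0, y-1), min(1, y)]
    lo, hi = max(0.0, y - 1.0), min(1.0, y)
    if lo >= hi:
        return 0.0
    val, _ = integrate.quad(lambda x: float(f(x) * f(y - x)), lo, hi)
    return val

for y in [0.25, 0.75, 1.0, 1.25, 1.75]:
    print(y, f_Y(y))   # rises linearly up to y = 1, then falls back to 0
```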

Now:
Analogous results for the product and the ratio of two random variables.
Theorem 4.5: (Distribution of a product / ratio)
Let $X_1$ and $X_2$ be continuous random variables with joint pdf $f_{X_1,X_2}(x_1, x_2)$. Then the pdfs of $Y_1 = X_1 \cdot X_2$ and $Y_2 = X_1 / X_2$ are given by
$$f_{Y_1}(y_1) = \int_{-\infty}^{+\infty} \frac{1}{|x_1|} \, f_{X_1,X_2}\left(x_1, \frac{y_1}{x_1}\right) dx_1$$
and
$$f_{Y_2}(y_2) = \int_{-\infty}^{+\infty} |x_2| \, f_{X_1,X_2}(y_2 \cdot x_2, x_2) \, dx_2$$
176

4.3 The Moment-generating-function Technique
Motivation:
Consider as given the random variables $X_1, \ldots, X_n$ with joint pdf $f_{X_1,\ldots,X_n}$.
Again, find the joint distribution of $Y_1, \ldots, Y_k$ where $Y_j = g_j(X_1, \ldots, X_n)$ for $j = 1, \ldots, k$.
177

According to Definition 3.14, Slide 143, the joint moment generating function of $Y_1, \ldots, Y_k$ is defined to be
$$m_{Y_1,\ldots,Y_k}(t_1, \ldots, t_k) = E\left[e^{t_1 Y_1 + \ldots + t_k Y_k}\right] = \int_{-\infty}^{+\infty} \cdots \int_{-\infty}^{+\infty} e^{t_1 g_1(x_1,\ldots,x_n) + \ldots + t_k g_k(x_1,\ldots,x_n)} \, f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) \, dx_1 \ldots dx_n$$
If $m_{Y_1,\ldots,Y_k}(t_1, \ldots, t_k)$ can be recognized as the joint moment generating function of some known joint distribution, it will follow that $Y_1, \ldots, Y_k$ has that joint distribution by virtue of the identification property (cf. Slide 145).
178

Example:
Consider $n = 1$ and $k = 1$ where the random variable $X_1 \equiv X$ has a standard normal distribution.
Consider the function $g_1(x) \equiv g(x) = x^2$.
Find the distribution of $Y = g(X) = X^2$.
The moment generating function of $Y$ is given by
$$m_Y(t) = E\left[e^{t Y}\right] = E\left[e^{t X^2}\right] = \int_{-\infty}^{+\infty} e^{t x^2} f_X(x) \, dx$$
179

$$= \int_{-\infty}^{+\infty} e^{t x^2} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2} x^2} \, dx = \ldots = (1 - 2t)^{-\frac{1}{2}} \quad \text{for } t < \frac{1}{2}$$
This is the moment generating function of a gamma distribution with parameters $\lambda = \frac{1}{2}$ and $r = \frac{1}{2}$ (see Mood, Graybill, Boes (1974), pp. 540/541)
$$\Longrightarrow \quad Y = X^2 \sim \Gamma(0.5, 0.5)$$
180
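
Numerical check: the integration step elided above can be verified by quadrature; a minimal sketch.

```python
# Check that E[exp(t X^2)] for X ~ N(0, 1) equals (1 - 2t)^(-1/2),
# the mgf of a Gamma(r = 1/2, lambda = 1/2) distribution, for t < 1/2.
import numpy as np
from scipy import integrate, stats

for t in [-1.0, 0.0, 0.25, 0.4]:
    mgf, _ = integrate.quad(lambda x: np.exp(t * x**2) * stats.norm.pdf(x),
                            -np.inf, np.inf)
    print(t, mgf, (1.0 - 2.0 * t) ** -0.5)   # the last two columns agree
```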

Now:
Distribution of sums of independent random variables.
Preliminaries:
Consider the moment generating function of such a sum.
Let $X_1, \ldots, X_n$ be independent random variables and let $Y = \sum_{i=1}^n X_i$. The moment generating function of $Y$ is given by
$$m_Y(t) = E\left[e^{t Y}\right] = E\left[e^{t \sum_{i=1}^n X_i}\right] = E\left[e^{t X_1} \cdot e^{t X_2} \cdots e^{t X_n}\right] = E\left[e^{t X_1}\right] \cdot E\left[e^{t X_2}\right] \cdots E\left[e^{t X_n}\right] = m_{X_1}(t) \cdot m_{X_2}(t) \cdots m_{X_n}(t)$$
[Theorem 3.13(c)]
181

Theorem 4.6: (Moment generating function of a sum)
Let $X_1, \ldots, X_n$ be stochastically independent random variables with existing moment generating functions $m_{X_1}(t), \ldots, m_{X_n}(t)$ for all $t \in (-h, h)$, $h > 0$. Then the moment generating function of the sum $Y = \sum_{i=1}^n X_i$ is given by
$$m_Y(t) = \prod_{i=1}^n m_{X_i}(t) \quad \text{for } t \in (-h, h).$$
Hopefully:
The distribution of the sum $Y = \sum_{i=1}^n X_i$ may be identified from the moment generating function of the sum $m_Y(t)$.
182

Example 1:
Assume that $X_1, \ldots, X_n$ are independent and identically distributed exponential random variables with parameter $\lambda > 0$.
The moment generating function of each $X_i$ ($i = 1, \ldots, n$) is given by
$$m_{X_i}(t) = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda$$
(cf. Mood, Graybill, Boes (1974), pp. 540/541)
So the moment generating function of the sum $Y = \sum_{i=1}^n X_i$ is given by
$$m_Y(t) = \prod_{i=1}^n m_{X_i}(t) = \left(\frac{\lambda}{\lambda - t}\right)^n$$
183

This is the moment generating function of a $\Gamma(n, \lambda)$ distribution (cf. Mood, Graybill, Boes (1974), pp. 540/541)
$\Longrightarrow$ the sum of $n$ independent, identically distributed exponential random variables with parameter $\lambda$ has a $\Gamma(n, \lambda)$ distribution.
184
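
Numerical check: a simulation sketch of this result; $n = 5$ and $\lambda = 2$ are arbitrary choices, and note that scipy parameterizes the gamma by shape $a = n$ and scale $= 1/\lambda$.

```python
# Sum of n iid Exp(lambda) variables vs. the Gamma(n, lambda) density.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, lam = 5, 2.0
Y = rng.exponential(scale=1.0 / lam, size=(500_000, n)).sum(axis=1)

hist, edges = np.histogram(Y, bins=100, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
pdf = stats.gamma.pdf(centers, a=n, scale=1.0 / lam)   # Gamma(n, lambda)
print(np.abs(hist - pdf).max())   # small: histogram matches the density
```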

Example 2:
Assume that $X_1, \ldots, X_n$ are independent random variables and that $X_i \sim N(\mu_i, \sigma_i^2)$.
Furthermore, let $a_1, \ldots, a_n \in \mathbb{R}$ be constants.
Then the distribution of the weighted sum is given by
$$Y = \sum_{i=1}^n a_i X_i \sim N\left(\sum_{i=1}^n a_i \mu_i, \sum_{i=1}^n a_i^2 \sigma_i^2\right)$$
(Proof: Class)
185
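
Numerical check: pending the proof in class, the claim can at least be verified by simulation; the weights and parameters below are illustrative choices.

```python
# Y = 2*X1 - 0.5*X2 with X1 ~ N(1, 4), X2 ~ N(-3, 9), independent;
# Example 2 claims Y ~ N(3.5, 18.25).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
a = np.array([2.0, -0.5])
mu, sigma2 = np.array([1.0, -3.0]), np.array([4.0, 9.0])

X = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(1_000_000, 2))
Y = X @ a
print(Y.mean(), a @ mu)            # both close to 3.5
print(Y.var(), a**2 @ sigma2)      # both close to 18.25
# normality: an empirical quantile vs. the claimed N(3.5, 18.25)
print(np.quantile(Y, 0.975), stats.norm.ppf(0.975, 3.5, np.sqrt(18.25)))
```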

4.4 General Transformations
Up to now:
Techniques that allow us, under special circumstances, to find the distributions of the transformed variables $Y_1 = g_1(X_1, \ldots, X_n), \ldots, Y_k = g_k(X_1, \ldots, X_n)$.
However:
These methods do not necessarily hit the mark (e.g. if calculations get too complicated).
186

Resort:
There are constructive methods by which it is generally possible (under rather mild conditions) to find the distributions of transformed random variables $\Longrightarrow$ transformation theorems.
Here:
We restrict attention to the simplest case where $n = 1$, $k = 1$, i.e. we consider the transformation $Y = g(X)$.
For multivariate extensions (i.e. for $n \geq 1$, $k \geq 1$) see Mood, Graybill, Boes (1974), pp.
187

Theorem 4.7: (Transformation theorem for densities)
Suppose $X$ is a continuous random variable with pdf $f_X(x)$. Set $D = \{x : f_X(x) > 0\}$. Furthermore, assume that
(a) the transformation $g: D \to W$ with $y = g(x)$ is a one-to-one transformation of $D$ onto $W$,
(b) the derivative with respect to $y$ of the inverse function $g^{-1}: W \to D$ with $x = g^{-1}(y)$ is continuous and nonzero for all $y \in W$.
Then $Y = g(X)$ is a continuous random variable with pdf
$$f_Y(y) = \begin{cases} \left| \dfrac{d g^{-1}(y)}{dy} \right| \cdot f_X\left(g^{-1}(y)\right), & \text{for } y \in W \\ 0, & \text{elsewise.} \end{cases}$$
188

Remark:
The transformation $g: D \to W$ with $y = g(x)$ is called one-to-one if for every $y \in W$ there exists exactly one $x \in D$ with $y = g(x)$.
Example:
Suppose $X$ has the pdf
$$f_X(x) = \begin{cases} \theta \cdot x^{-\theta - 1}, & \text{for } x \in [1, +\infty) \\ 0, & \text{elsewise} \end{cases}$$
(Pareto distribution with parameter $\theta > 0$)
Find the distribution of $Y = \ln(X)$.
We have $D = [1, +\infty)$, $g(x) = \ln(x)$, $W = [0, +\infty)$.
189

Furthermore, $g(x) = \ln(x)$ is a one-to-one transformation of $D = [1, +\infty)$ onto $W = [0, +\infty)$ with inverse function $x = g^{-1}(y) = e^y$. Its derivative with respect to $y$ is given by
$$\frac{d g^{-1}(y)}{dy} = e^y,$$
i.e. the derivative is continuous and nonzero for all $y \in [0, +\infty)$.
Hence, the pdf of $Y = \ln(X)$ is given by
$$f_Y(y) = \begin{cases} e^y \cdot \theta \cdot (e^y)^{-\theta - 1}, & \text{for } y \in [0, +\infty) \\ 0, & \text{elsewise} \end{cases} = \begin{cases} \theta \cdot e^{-\theta y}, & \text{for } y \in [0, +\infty) \\ 0, & \text{elsewise} \end{cases}$$
190
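
Numerical check: a final simulation sketch of this example; $\theta = 3$ is an arbitrary choice, and $X$ is drawn by inverse-cdf sampling from $F_X(x) = 1 - x^{-\theta}$.

```python
# Pareto example: Y = ln(X) should be exponential with parameter theta.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
theta = 3.0
X = rng.uniform(size=1_000_000) ** (-1.0 / theta)   # Pareto on [1, inf)
Y = np.log(X)

grid = np.linspace(0.25, 2.0, 4)
print(stats.expon.cdf(grid, scale=1.0 / theta))     # theoretical Exp(theta) cdf
print(np.array([(Y <= g).mean() for g in grid]))    # empirical cdf of ln(X)
```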
