Lecture 4: Proofs for Expectation, Variance, and Covariance Formula

1 Lecture 4: Proofs for Expectation, Variance, and Covariance Formula. By Hiro Kasahara, Vancouver School of Economics, University of British Columbia.

2 Discrete Random Variables: X and Y. Let X and Y be two discrete random variables. X takes n possible values, $\{x_1, \dots, x_n\}$, and Y takes m possible values, $\{y_1, \dots, y_m\}$.

3 Notations: The joint probability mass function is given by $p^{X,Y}_{ij} = P(X = x_i, Y = y_j)$ for $i = 1, \dots, n$ and $j = 1, \dots, m$. The marginal probability mass function of X is $p^X_i = P(X = x_i) = \sum_{j=1}^{m} p^{X,Y}_{ij}$, $i = 1, \dots, n$, and the marginal probability mass function of Y is $p^Y_j = P(Y = y_j) = \sum_{i=1}^{n} p^{X,Y}_{ij}$, $j = 1, \dots, m$.

4 Notations: Table: Example of Joint Distribution of X and Y

                      Y = y_1         Y = y_2         Y = y_3         Marg. prob. of X
  X = x_1             p^{X,Y}_{11}    p^{X,Y}_{12}    p^{X,Y}_{13}    p^X_1
  X = x_2             p^{X,Y}_{21}    p^{X,Y}_{22}    p^{X,Y}_{23}    p^X_2
  Marg. prob. of Y    p^Y_1           p^Y_2           p^Y_3           1.00

Reading across the first row, $P(X = x_1) = \sum_{j=1}^{3} p^{X,Y}_{1j} = p^{X,Y}_{11} + p^{X,Y}_{12} + p^{X,Y}_{13} = p^X_1$.

5 Notations: Table: Numerical example of the joint distribution of X and Y, with Y taking the values 30, 60, and 100 and X taking the values 0 and 1. Each cell gives a joint probability, the margins give the marginal probabilities of X and Y, and the marginal probabilities sum to 1.00. For example, $P(X = 0)$ is the sum of the joint probabilities in the first row: $P(X = 0) = \sum_{j=1}^{3} P(X = 0, Y = y_j)$.
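The marginal probabilities in the tables above are simply row and column sums of the joint probabilities. The Python sketch below carries out that computation; the joint pmf entries are made-up illustrative values (not the numbers on the slide), and any nonnegative table whose entries sum to one works the same way.

```python
# Minimal sketch: marginal pmfs as row and column sums of a joint pmf.
# The joint probabilities below are illustrative assumptions, not the
# values used on the slide.

x_vals = [0, 1]
y_vals = [30, 60, 100]

# p_xy[i][j] = P(X = x_vals[i], Y = y_vals[j])
p_xy = [
    [0.10, 0.20, 0.10],   # X = 0
    [0.20, 0.25, 0.15],   # X = 1
]

# Marginal pmf of X: sum each row over j.
p_x = [sum(row) for row in p_xy]
# Marginal pmf of Y: sum each column over i.
p_y = [sum(p_xy[i][j] for i in range(len(x_vals))) for j in range(len(y_vals))]

print("P(X = 0) =", round(p_x[0], 2))
print("P(X = 1) =", round(p_x[1], 2))
print("marginal pmf of Y:", [round(p, 2) for p in p_y])
print("total probability mass:", round(sum(p_x), 2))   # should be 1.0
```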

6 Proposition: If a and b are constants, then $E(a + bX) = a + bE(X)$.
Proof:
$E(a + bX) \overset{\text{def}}{=} \sum_{i=1}^{n} (a + b x_i)\, p^X_i$
$= (a + b x_1) p^X_1 + \dots + (a + b x_n) p^X_n$
$= (a p^X_1 + \dots + a p^X_n) + (b x_1 p^X_1 + \dots + b x_n p^X_n)$
$= a (p^X_1 + \dots + p^X_n) + b (x_1 p^X_1 + \dots + x_n p^X_n)$
$= a \sum_{i=1}^{n} p^X_i + b \sum_{i=1}^{n} x_i p^X_i$
$= a + b E(X)$,
where the last step uses $\sum_{i=1}^{n} p^X_i = 1$ and the definition of $E(X)$.
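Both sides of the proposition can be checked numerically. The sketch below evaluates E(a + bX) directly from a pmf and compares it with a + bE(X); the support, the probabilities, and the constants a and b are arbitrary illustrative choices.

```python
# Minimal sketch: check E(a + bX) = a + b E(X) on a small discrete pmf.
# The support, probabilities, and constants below are illustrative.

x_vals = [1.0, 2.0, 5.0]
p_x    = [0.2, 0.5, 0.3]
a, b   = 3.0, -2.0

E_X = sum(x * p for x, p in zip(x_vals, p_x))

# Left-hand side: expectation of the transformed variable a + bX.
lhs = sum((a + b * x) * p for x, p in zip(x_vals, p_x))
# Right-hand side: a + b E(X).
rhs = a + b * E_X

print(lhs, rhs)                     # both equal 3 - 2 * 2.7 = -2.4
assert abs(lhs - rhs) < 1e-12
```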

7 Clicker Question 4-1: For any constants a and b, and for any function g(X), which of the following is true?
A) E[a + b g(X)] = a + b g(E[X]).
B) E[a + b g(X)] = a + b E[g(X)].
C) E[a + b g(X)] = a + b g(E[X]) = a + b E[g(X)].

8 Question 1 in Worksheet: Prove that, for any constants a and b, and for any function g(X), E[a + b g(X)] = a + b E[g(X)].

9 Proposition: Cov(a_1 + b_1 X, a_2 + b_2 Y) = b_1 b_2 Cov(X, Y), where a_1, a_2, b_1, and b_2 are constants.
Proof:
$Cov(a_1 + b_1 X,\, a_2 + b_2 Y) \overset{\text{def}}{=} E[(a_1 + b_1 X - E(a_1 + b_1 X))(a_2 + b_2 Y - E(a_2 + b_2 Y))]$
$= E[(a_1 - a_1 + b_1 X - b_1 E(X))(a_2 - a_2 + b_2 Y - b_2 E(Y))]$
$= E[b_1 b_2 (X - E(X))(Y - E(Y))]$
$= \sum_{i=1}^{n} \sum_{j=1}^{m} b_1 b_2 (x_i - E(X))(y_j - E(Y))\, p^{X,Y}_{ij}$
$= b_1 b_2 \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, p^{X,Y}_{ij}$
$= b_1 b_2 Cov(X, Y)$.
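A quick numerical check of this result, using a small joint pmf and arbitrary constants; every value below (the support points, the joint probabilities, and a_1, b_1, a_2, b_2) is an illustrative assumption rather than something from the lecture.

```python
# Minimal sketch: check Cov(a1 + b1*X, a2 + b2*Y) = b1 * b2 * Cov(X, Y)
# on a small joint pmf. All numbers below are illustrative.

x_vals = [0, 1]
y_vals = [1, 2, 4]
p_xy = [
    [0.10, 0.25, 0.15],   # X = 0
    [0.20, 0.10, 0.20],   # X = 1
]

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p_xy[i][j]
               for i, x in enumerate(x_vals)
               for j, y in enumerate(y_vals))

def cov(g, h):
    """Covariance of g(X, Y) and h(X, Y)."""
    mg, mh = E(g), E(h)
    return E(lambda x, y: (g(x, y) - mg) * (h(x, y) - mh))

a1, b1, a2, b2 = 2.0, 3.0, -1.0, 0.5

lhs = cov(lambda x, y: a1 + b1 * x, lambda x, y: a2 + b2 * y)
rhs = b1 * b2 * cov(lambda x, y: x, lambda x, y: y)
print(lhs, rhs)                     # the two values agree
assert abs(lhs - rhs) < 1e-12
```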

10 Clicker Question 4-2: For any constants a and b, which of the following is true?
A) Var[a + bX] = a + b Var[X].
B) Var[a + bX] = b Var[X].
C) Var[a + bX] = b^2 Var[X].

11 Question 2 in Worksheet: Prove that, for any constants a and b, Var[a + bX] = b^2 Var[X].

12 Proposition: If X and Y are independent, then Cov(X, Y) = 0.
Proof:
$Cov(X, Y) \overset{\text{def}}{=} E[(X - E(X))(Y - E(Y))]$
$= \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, P(X = x_i, Y = y_j)$
$= \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, p^X_i p^Y_j$   (by independence)
$= \sum_{i=1}^{n} \sum_{j=1}^{m} \{[x_i - E(X)] p^X_i\} \{[y_j - E(Y)] p^Y_j\}$

13 Proof continued:
$= \sum_{i=1}^{n} \sum_{j=1}^{m} \{[x_i - E(X)] p^X_i\} \{[y_j - E(Y)] p^Y_j\}$
$= \left\{ \sum_{i=1}^{n} [x_i - E(X)] p^X_i \right\} \left\{ \sum_{j=1}^{m} [y_j - E(Y)] p^Y_j \right\}$
$= \left\{ \sum_{i=1}^{n} x_i p^X_i - \sum_{i=1}^{n} E(X) p^X_i \right\} \left\{ \sum_{j=1}^{m} y_j p^Y_j - \sum_{j=1}^{m} E(Y) p^Y_j \right\}$
$= \left\{ E(X) - E(X) \sum_{i=1}^{n} p^X_i \right\} \left\{ E(Y) - E(Y) \sum_{j=1}^{m} p^Y_j \right\}$   (by definition of E(X) and E(Y))

14 Proof continued:
$= \left\{ E(X) - E(X) \sum_{i=1}^{n} p^X_i \right\} \left\{ E(Y) - E(Y) \sum_{j=1}^{m} p^Y_j \right\}$
$= \{E(X) - E(X) \cdot 1\} \{E(Y) - E(Y) \cdot 1\}$
$= 0 \cdot 0 = 0$.
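Since independence means the joint pmf factors into the product of the marginals, the result is easy to verify numerically: build a joint pmf as $p^{X,Y}_{ij} = p^X_i p^Y_j$ from two arbitrary marginal pmfs and compute the covariance directly. The marginal distributions below are illustrative choices.

```python
# Minimal sketch: under independence the joint pmf is the product of the
# marginals, and the covariance computed from it is 0 (up to rounding).
# The marginal pmfs below are illustrative assumptions.

x_vals, p_x = [0, 1, 3], [0.2, 0.5, 0.3]
y_vals, p_y = [10, 20],  [0.4, 0.6]

# Joint pmf of independent X and Y: p_xy[i][j] = p_x[i] * p_y[j].
p_xy = [[px * py for py in p_y] for px in p_x]

E_X = sum(x * px for x, px in zip(x_vals, p_x))
E_Y = sum(y * py for y, py in zip(y_vals, p_y))

cov_XY = sum((x - E_X) * (y - E_Y) * p_xy[i][j]
             for i, x in enumerate(x_vals)
             for j, y in enumerate(y_vals))

print(cov_XY)                       # 0 up to floating-point error
assert abs(cov_XY) < 1e-12
```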

15 Clicker Question 4-3: Suppose that X and Y are independent. For any functions g(X) and h(Y), which of the following is true?
A) Cov(g(X), h(Y)) = 0 always holds.
B) Cov(g(X), h(Y)) = 0 does not hold in some cases.

16 Question 4 in Worksheet: Prove that, when X and Y are independent, Cov(g(X), h(Y)) = 0 for any functions g(X) and h(Y).

17 Proposition: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
Proof: Let $\tilde{X} \overset{\text{def}}{=} X - E(X)$ and $\tilde{Y} \overset{\text{def}}{=} Y - E(Y)$.
$Var(X + Y) \overset{\text{def}}{=} E[(X + Y - E(X + Y))^2]$
$= E[((X - E(X)) + (Y - E(Y)))^2]$
$= E[(\tilde{X} + \tilde{Y})^2]$
$= E[\tilde{X}^2 + \tilde{Y}^2 + 2 \tilde{X} \tilde{Y}]$
$= E[(X - E(X))^2] + E[(Y - E(Y))^2] + 2 E[(X - E(X))(Y - E(Y))]$
$= Var(X) + Var(Y) + 2 Cov(X, Y)$,
by the definitions of variance and covariance.
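The decomposition can be checked numerically by computing Var(X + Y) directly from a joint pmf and comparing it with Var(X) + Var(Y) + 2 Cov(X, Y). The joint probabilities below are illustrative and deliberately chosen so that X and Y are not independent.

```python
# Minimal sketch: check Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
# on a small joint pmf with dependent X and Y (illustrative numbers).

x_vals = [0, 1]
y_vals = [0, 2]
p_xy = [
    [0.30, 0.10],   # X = 0
    [0.15, 0.45],   # X = 1
]

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p_xy[i][j]
               for i, x in enumerate(x_vals)
               for j, y in enumerate(y_vals))

E_X, E_Y = E(lambda x, y: x), E(lambda x, y: y)
var_X  = E(lambda x, y: (x - E_X) ** 2)
var_Y  = E(lambda x, y: (y - E_Y) ** 2)
cov_XY = E(lambda x, y: (x - E_X) * (y - E_Y))

lhs = E(lambda x, y: (x + y - (E_X + E_Y)) ** 2)   # Var(X + Y) computed directly
rhs = var_X + var_Y + 2 * cov_XY
print(lhs, rhs)                     # the two values agree
assert abs(lhs - rhs) < 1e-12
```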

18 Clicker Question 4-4: For any functions g(X) and h(Y), which of the following is true?
A) Var(g(X) + h(Y)) = Var(g(X)) + Var(h(Y)).
B) Var(g(X) + h(Y)) = Var(g(X)) + Var(h(Y)) + 2 Cov(g(X), h(Y)).
C) Var(g(X) + h(Y)) = g(Var(X)) + h(Var(Y)).

19 Question 5 in Worksheet: Prove that, for any constants a and b, Var[aX + bY] = a^2 Var[X] + b^2 Var[Y] + 2ab Cov[X, Y].

20 Bernoulli random variable. The probability mass function:
$X = \begin{cases} 1 & \text{with probability } p \\ 0 & \text{with probability } 1 - p \end{cases}$
What are E[X] and Var[X]?

21 Proposition: If X is a Bernoulli random variable with P(X = 1) = p, then E[X] = p.
Proof:
$E(X) \overset{\text{def}}{=} \sum_{x=0,1} x\, p^x (1-p)^{1-x} = (0)(1-p) + (1)(p) = p$.

22 Clicker Question 4-6: Suppose that X_1 and X_2 are two Bernoulli random variables that are stochastically independent with P(X_1 = 1) = P(X_2 = 1) = p. Which of the following is true?
A) E[X_1 + X_2] = p.
B) E[X_1 + X_2] = 2p.
C) E[X_1 + X_2] = p/2.

23 Question 7 in Worksheet: Prove that, if X and Y are two Bernoulli random variables that are stochastically independent with P(X = 1) = P(Y = 1) = p, then E[X + Y] = 2p.

24 Proposition: If X is a Bernoulli random variable with P(X = 1) = p, then Var[X] = p(1 - p).
Proof:
$Var(X) \overset{\text{def}}{=} \sum_{x=0,1} (x - p)^2\, p^x (1-p)^{1-x}$
$= (0 - p)^2 (1-p) + (1 - p)^2 p$
$= p^2 (1-p) + (1-p)^2 p$
$= (p + (1-p))\, p(1-p)$
$= p(1-p)$.
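Both Bernoulli results, E[X] = p and Var[X] = p(1 - p), can be checked directly from the two-point pmf. A minimal sketch, with p chosen arbitrarily:

```python
# Minimal sketch: E[X] = p and Var[X] = p(1 - p) for a Bernoulli pmf,
# computed directly from the two-point distribution (p is arbitrary).

p = 0.3
support = [0, 1]
pmf = {0: 1 - p, 1: p}              # P(X = 1) = p, P(X = 0) = 1 - p

E_X   = sum(x * pmf[x] for x in support)
var_X = sum((x - E_X) ** 2 * pmf[x] for x in support)

print(E_X, "vs", p)                 # 0.3 vs 0.3
print(var_X, "vs", p * (1 - p))     # 0.21 vs 0.21
assert abs(E_X - p) < 1e-12
assert abs(var_X - p * (1 - p)) < 1e-12
```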

25 Clicker Question 4-7: Suppose that X_1 and X_2 are two Bernoulli random variables that are stochastically independent with P(X_i = 1) = p for i = 1, 2. Which of the following is true?
A) Var[(X_1 + X_2)/2] = p(1 - p).
B) Var[(X_1 + X_2)/2] = p(1 - p)/2.
C) Var[(X_1 + X_2)/2] = p(1 - p)/4.

26 Question 9 in Worksheet: Prove that, if X and Y are two Bernoulli random variables that are stochastically independent with P(X = 1) = P(Y = 1) = p, then
$Var\left[\frac{X + Y}{2}\right] = \frac{p(1-p)}{2}$.

27 Proposition: Suppose that $X_i$, $i = 1, \dots, n$, are n Bernoulli random variables that are stochastically independent of each other with $P(X_i = 1) = p$ for $i = 1, \dots, n$. Let $\bar{X} := \frac{1}{n} \sum_{i=1}^{n} X_i$. Then (1) $E[\bar{X}] = p$ and (2) $Var[\bar{X}] = \frac{p(1-p)}{n}$.
Proof for (1):
$E[\bar{X}] = \frac{1}{n} E[X_1 + X_2 + \dots + X_n] = \frac{1}{n} \{E[X_1] + E[X_2] + \dots + E[X_n]\} = \frac{p + p + \dots + p}{n} = \frac{np}{n} = p$.

28 Proof for (2): $Var[\bar{X}] = \frac{p(1-p)}{n}$.
$Var[\bar{X}] = Var\left[\frac{1}{n} \sum_{i=1}^{n} X_i\right] = \left(\frac{1}{n}\right)^2 Var[X_1 + X_2 + \dots + X_n]$
$= \frac{1}{n^2} \{Var[X_1] + Var[X_2] + \dots + Var[X_n] + 2 Cov(X_1, X_2) + 2 Cov(X_1, X_3) + \dots + 2 Cov(X_{n-1}, X_n)\}$
$= \frac{1}{n^2} \{p(1-p) + p(1-p) + \dots + p(1-p) + 0\}$
$= \frac{n\, p(1-p)}{n^2} = \frac{p(1-p)}{n}$.
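These two results can also be illustrated by simulation: draw many samples of n independent Bernoulli(p) variables, compute the sample mean of each, and compare the empirical mean and variance of the sample means with p and p(1 - p)/n. A minimal Monte Carlo sketch; p, n, and the number of replications are arbitrary choices.

```python
# Minimal sketch: Monte Carlo check that the mean of n independent
# Bernoulli(p) draws has expectation p and variance p(1 - p)/n.
# p, n, and the number of replications below are arbitrary.

import random

random.seed(0)
p, n, reps = 0.3, 25, 100_000

xbars = []
for _ in range(reps):
    draws = [1 if random.random() < p else 0 for _ in range(n)]
    xbars.append(sum(draws) / n)

mean_xbar = sum(xbars) / reps
var_xbar  = sum((xb - mean_xbar) ** 2 for xb in xbars) / reps

print("simulated E[X-bar]  :", round(mean_xbar, 4), "  theory:", p)
print("simulated Var[X-bar]:", round(var_xbar, 5), "  theory:", p * (1 - p) / n)
```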
