Problem Set #5
Econ 103

Part I: Problems from the Textbook
Chapter 4: 1, 3, 5, 7, 9, 11, 13, 15, 25, 27, 29
Chapter 5: 1, 3, 5, 9, 11, 13, 17

Part II: Additional Problems

1. Suppose X is a random variable with support {-1, 0, 1} where p(-1) = q and p(1) = p.

(a) What is p(0)?

By the complement rule, p(0) = 1 - p - q.

(b) Calculate the CDF, F(x_0), of X.

    F(x_0) = 0          for x_0 < -1
             q          for -1 <= x_0 < 0
             1 - p      for 0 <= x_0 < 1
             1          for x_0 >= 1

(c) Calculate E[X].

    E[X] = -1 · q + 0 · (1 - p - q) + 1 · p = p - q

(d) What relationship must hold between p and q to ensure E[X] = 0?

p = q.
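As a quick sanity check on Problem 1 (not part of the assigned solution), here is a minimal Python sketch that evaluates the pmf, CDF, and mean for the arbitrary example values p = 0.3 and q = 0.2:

    # Three-point distribution from Problem 1, with example values p = 0.3, q = 0.2 (arbitrary).
    p, q = 0.3, 0.2
    pmf = {-1: q, 0: 1 - p - q, 1: p}   # complement rule gives p(0)

    def cdf(x0):
        """F(x0) = P(X <= x0) for this three-point distribution."""
        return sum(prob for value, prob in pmf.items() if value <= x0)

    mean = sum(value * prob for value, prob in pmf.items())

    print(pmf[0])                # ~0.5 = 1 - p - q
    print(cdf(-0.5), cdf(0.5))   # ~0.2 = q and ~0.7 = 1 - p
    print(mean)                  # ~0.1 = p - q, which is zero exactly when p = q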

2. Fill in the missing details from class to calculate the variance of a Bernoulli Random Variable directly, that is, without using the shortcut formula.

    σ^2 = Var(X) = Σ_{x ∈ {0,1}} (x - μ)^2 p(x)
                 = Σ_{x ∈ {0,1}} (x - p)^2 p(x)
                 = (0 - p)^2 (1 - p) + (1 - p)^2 p
                 = p^2 (1 - p) + (1 - p)^2 p
                 = p^2 - p^3 + p - 2p^2 + p^3
                 = p - p^2
                 = p(1 - p)

3. Prove that the Bernoulli Random Variable is a special case of the Binomial Random Variable for which n = 1. (Hint: compare pmfs.)

The pmf of a Binomial(n, p) random variable is

    p(x) = C(n, x) p^x (1 - p)^(n - x)

with support {0, 1, 2, ..., n}, where C(n, x) = n!/(x!(n - x)!) is the binomial coefficient. Setting n = 1 gives

    p(x) = C(1, x) p^x (1 - p)^(1 - x)

with support {0, 1}. Plugging in each realization in the support, and recalling that 0! = 1, we have

    p(0) = 1!/(0!(1 - 0)!) · p^0 (1 - p)^(1 - 0) = 1 - p

and

    p(1) = 1!/(1!(1 - 1)!) · p^1 (1 - p)^0 = p

which is exactly how we defined the Bernoulli Random Variable.
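The following short Python check (with an arbitrarily chosen p) recomputes the Bernoulli variance from Problem 2 directly and confirms that the Binomial pmf with n = 1 reduces to the Bernoulli pmf, as shown in Problem 3:

    from math import comb

    p = 0.3   # arbitrary success probability

    # Problem 2: variance of a Bernoulli(p) computed directly from the definition.
    mu = 0 * (1 - p) + 1 * p
    var_direct = (0 - mu) ** 2 * (1 - p) + (1 - mu) ** 2 * p
    print(var_direct, p * (1 - p))     # both ~0.21

    # Problem 3: Binomial(n, p) pmf evaluated at n = 1 matches the Bernoulli pmf.
    def binom_pmf(x, n, prob):
        return comb(n, x) * prob ** x * (1 - prob) ** (n - x)

    print(binom_pmf(0, 1, p), 1 - p)   # both 0.7
    print(binom_pmf(1, 1, p), p)       # both 0.3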

4. Suppose that X is a random variable with support {1, 2} and Y is a random variable with support {0, 1}, where X and Y have the following joint distribution:

    p_XY(1, 0) = 0.20,  p_XY(1, 1) = 0.30
    p_XY(2, 0) = 0.25,  p_XY(2, 1) = 0.25

(a) Express the joint distribution in a 2 x 2 table.

              X = 1    X = 2
    Y = 0      0.20     0.25
    Y = 1      0.30     0.25

(b) Using the table, calculate the marginal probability distributions of X and Y.

    p_X(1) = p_XY(1, 0) + p_XY(1, 1) = 0.20 + 0.30 = 0.50
    p_X(2) = p_XY(2, 0) + p_XY(2, 1) = 0.25 + 0.25 = 0.50
    p_Y(0) = p_XY(1, 0) + p_XY(2, 0) = 0.20 + 0.25 = 0.45
    p_Y(1) = p_XY(1, 1) + p_XY(2, 1) = 0.30 + 0.25 = 0.55

(c) Calculate the conditional probability distribution of Y given X = 1 and of Y given X = 2.

The distribution of Y | X = 1 is

    P(Y = 0 | X = 1) = p_XY(1, 0) / p_X(1) = 0.2 / 0.5 = 0.4
    P(Y = 1 | X = 1) = p_XY(1, 1) / p_X(1) = 0.3 / 0.5 = 0.6

while the distribution of Y | X = 2 is

    P(Y = 0 | X = 2) = p_XY(2, 0) / p_X(2) = 0.25 / 0.5 = 0.5
    P(Y = 1 | X = 2) = p_XY(2, 1) / p_X(2) = 0.25 / 0.5 = 0.5
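To double-check parts (b) and (c), the sketch below recovers the marginal and conditional distributions from the joint pmf by direct summation (purely an illustration, not required for the problem):

    # Joint pmf from Problem 4, stored as {(x, y): probability}.
    joint = {(1, 0): 0.20, (1, 1): 0.30, (2, 0): 0.25, (2, 1): 0.25}

    # Marginals: sum the joint pmf over the other variable.
    p_X = {x: sum(pr for (xx, yy), pr in joint.items() if xx == x) for x in (1, 2)}
    p_Y = {y: sum(pr for (xx, yy), pr in joint.items() if yy == y) for y in (0, 1)}
    print(p_X, p_Y)   # p_X(1) = p_X(2) = 0.5; p_Y(0) = 0.45, p_Y(1) = 0.55

    # Conditional distributions of Y given X = x: joint probability over marginal of X.
    for x in (1, 2):
        cond = {y: joint[(x, y)] / p_X[x] for y in (0, 1)}
        print(f"Y | X = {x}: {cond}")   # matches part (c): (0.4, 0.6) and (0.5, 0.5)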

(d) Calculate E[Y | X].

    E[Y | X = 1] = 0 · 0.4 + 1 · 0.6 = 0.6
    E[Y | X = 2] = 0 · 0.5 + 1 · 0.5 = 0.5

Hence,

    E[Y | X] = 0.6 with probability 0.5
               0.5 with probability 0.5

since p_X(1) = 0.5 and p_X(2) = 0.5.

(e) What is E[E[Y | X]]?

    E[E[Y | X]] = 0.5 · 0.6 + 0.5 · 0.5 = 0.3 + 0.25 = 0.55

Note that this equals the expectation of Y calculated from its marginal distribution, since E[Y] = 0 · 0.45 + 1 · 0.55 = 0.55. This illustrates the so-called Law of Iterated Expectations.

(f) Calculate the covariance between X and Y using the shortcut formula.

First, from the marginal distributions, E[X] = 1 · 0.5 + 2 · 0.5 = 1.5 and E[Y] = 0 · 0.45 + 1 · 0.55 = 0.55. Hence E[X]E[Y] = 1.5 · 0.55 = 0.825. Second,

    E[XY] = (0 · 1) · 0.2 + (0 · 2) · 0.25 + (1 · 1) · 0.3 + (1 · 2) · 0.25 = 0.3 + 0.5 = 0.8

Finally,

    Cov(X, Y) = E[XY] - E[X]E[Y] = 0.8 - 0.825 = -0.025
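As a further check on parts (d) through (f), the following sketch recomputes E[Y | X], verifies the Law of Iterated Expectations, and evaluates the covariance shortcut formula (again purely illustrative):

    # Joint pmf and marginals from Problem 4.
    joint = {(1, 0): 0.20, (1, 1): 0.30, (2, 0): 0.25, (2, 1): 0.25}
    p_X = {1: 0.50, 2: 0.50}
    p_Y = {0: 0.45, 1: 0.55}

    # Part (d): conditional means E[Y | X = x].
    E_Y_given_X = {x: sum(y * joint[(x, y)] / p_X[x] for y in (0, 1)) for x in (1, 2)}
    print(E_Y_given_X)   # {1: 0.6, 2: 0.5}

    # Part (e): Law of Iterated Expectations, E[E[Y|X]] = E[Y].
    E_Y = sum(y * pr for y, pr in p_Y.items())
    print(sum(E_Y_given_X[x] * p_X[x] for x in (1, 2)), E_Y)   # both 0.55

    # Part (f): covariance via the shortcut formula E[XY] - E[X]E[Y].
    E_X = sum(x * pr for x, pr in p_X.items())
    E_XY = sum(x * y * pr for (x, y), pr in joint.items())
    print(E_XY - E_X * E_Y)   # ~ -0.025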

5. Let X and Y be discrete random variables and a, b, c, d be constants. Prove the following:

(a) Cov(a + bX, c + dY) = bd · Cov(X, Y)

Let μ_X = E[X] and μ_Y = E[Y]. By the linearity of expectation,

    E[a + bX] = a + bμ_X
    E[c + dY] = c + dμ_Y

Thus, we have

    (a + bX) - E[a + bX] = b(X - μ_X)
    (c + dY) - E[c + dY] = d(Y - μ_Y)

Substituting these into the formula for the covariance between two discrete random variables,

    Cov(a + bX, c + dY) = Σ_x Σ_y [b(x - μ_X)][d(y - μ_Y)] p(x, y)
                        = bd Σ_x Σ_y (x - μ_X)(y - μ_Y) p(x, y)
                        = bd · Cov(X, Y)

(b) Corr(a + bX, c + dY) = Corr(X, Y), provided that b and d are positive.

    Corr(a + bX, c + dY) = Cov(a + bX, c + dY) / sqrt(Var(a + bX) · Var(c + dY))
                         = bd · Cov(X, Y) / sqrt(b^2 Var(X) · d^2 Var(Y))
                         = Cov(X, Y) / sqrt(Var(X) · Var(Y))
                         = Corr(X, Y)

where the second-to-last step uses sqrt(b^2 d^2) = bd, which holds because b and d are positive.

6. Fill in the missing steps from lecture to prove the shortcut formula for covariance: Cov(X, Y) = E[XY] - E[X]E[Y].

By the linearity of expectation,

    Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)]
              = E[XY - μ_X Y - μ_Y X + μ_X μ_Y]
              = E[XY] - μ_X E[Y] - μ_Y E[X] + μ_X μ_Y
              = E[XY] - μ_X μ_Y - μ_Y μ_X + μ_X μ_Y
              = E[XY] - μ_X μ_Y
              = E[XY] - E[X]E[Y]
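The identities in Problems 5 and 6 can also be checked by simulation. The sketch below (arbitrary constants and distributions, using NumPy) compares Cov(a + bX, c + dY) against bd · Cov(X, Y), with each covariance computed via the shortcut formula:

    import numpy as np

    # Simulation check of Problems 5(a) and 6 (illustrative only, not a proof).
    rng = np.random.default_rng(0)
    n = 1_000_000
    X = rng.normal(size=n)
    Y = 0.5 * X + rng.normal(size=n)   # Y is correlated with X by construction
    a, b, c, d = 2.0, 3.0, -1.0, 0.5   # arbitrary constants

    # Shortcut formula from Problem 6: Cov(X, Y) = E[XY] - E[X]E[Y].
    cov_xy = np.mean(X * Y) - np.mean(X) * np.mean(Y)
    cov_transformed = (np.mean((a + b * X) * (c + d * Y))
                       - np.mean(a + b * X) * np.mean(c + d * Y))

    print(cov_transformed, b * d * cov_xy)   # nearly identical, up to sampling error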

7. Let X_1 be a random variable denoting the returns of stock 1, and X_2 be a random variable denoting the returns of stock 2. Accordingly, let μ_1 = E[X_1], μ_2 = E[X_2], σ_1^2 = Var(X_1), σ_2^2 = Var(X_2), and ρ = Corr(X_1, X_2). A portfolio, Π, is a linear combination of X_1 and X_2 with weights that sum to one, that is, Π(ω) = ωX_1 + (1 - ω)X_2, where the weights ω and 1 - ω indicate the proportions of stock 1 and stock 2 that an investor holds. In this example we require ω ∈ [0, 1], so that negative weights are not allowed. (This rules out short-selling.)

(a) Calculate E[Π(ω)] in terms of ω, μ_1 and μ_2.

    E[Π(ω)] = E[ωX_1 + (1 - ω)X_2] = ωE[X_1] + (1 - ω)E[X_2] = ωμ_1 + (1 - ω)μ_2

(b) If ω ∈ [0, 1], is it possible to have E[Π(ω)] > μ_1 and E[Π(ω)] > μ_2? What about E[Π(ω)] < μ_1 and E[Π(ω)] < μ_2? Explain.

No. When short-selling is disallowed, E[Π(ω)] is a weighted average of μ_1 and μ_2 with non-negative weights that sum to one, so the portfolio's expected return must lie between μ_1 and μ_2. (See the numerical illustration below.)

(c) Express Cov(X_1, X_2) in terms of ρ, σ_1 and σ_2.

    Cov(X_1, X_2) = ρσ_1σ_2

(d) What is Var[Π(ω)]? (Your answer should be in terms of ρ, σ_1^2 and σ_2^2.)

    Var[Π(ω)] = Var[ωX_1 + (1 - ω)X_2]
              = ω^2 Var(X_1) + (1 - ω)^2 Var(X_2) + 2ω(1 - ω) Cov(X_1, X_2)
              = ω^2 σ_1^2 + (1 - ω)^2 σ_2^2 + 2ω(1 - ω)ρσ_1σ_2
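The numerical illustration referenced in part (b): for example parameter values (all made up for illustration), the portfolio mean is always a weighted average of μ_1 and μ_2, while the variance varies with ω according to the formula in part (d). A minimal Python sketch:

    import numpy as np

    # Example parameters (arbitrary, for illustration only).
    mu_1, mu_2, sigma_1, sigma_2, rho = 0.08, 0.05, 0.20, 0.10, 0.3

    for omega in np.linspace(0, 1, 5):
        mean = omega * mu_1 + (1 - omega) * mu_2
        var = (omega**2 * sigma_1**2 + (1 - omega)**2 * sigma_2**2
               + 2 * omega * (1 - omega) * rho * sigma_1 * sigma_2)
        print(f"omega = {omega:.2f}   E[Pi] = {mean:.4f}   Var[Pi] = {var:.5f}")
    # Every E[Pi] printed lies between mu_2 = 0.05 and mu_1 = 0.08, as argued in part (b).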

(e) Using part (d), show that the value of ω that minimizes Var[Π(ω)] is

    ω* = (σ_2^2 - ρσ_1σ_2) / (σ_1^2 + σ_2^2 - 2ρσ_1σ_2)

In other words, Π(ω*) is the minimum variance portfolio.

The first order condition is

    2ωσ_1^2 - 2(1 - ω)σ_2^2 + (2 - 4ω)ρσ_1σ_2 = 0

Dividing both sides by two and rearranging:

    ωσ_1^2 - (1 - ω)σ_2^2 + (1 - 2ω)ρσ_1σ_2 = 0
    ωσ_1^2 - σ_2^2 + ωσ_2^2 + ρσ_1σ_2 - 2ωρσ_1σ_2 = 0
    ω(σ_1^2 + σ_2^2 - 2ρσ_1σ_2) = σ_2^2 - ρσ_1σ_2

So we have

    ω* = (σ_2^2 - ρσ_1σ_2) / (σ_1^2 + σ_2^2 - 2ρσ_1σ_2)

(f) If you want a challenge, check the second order condition from part (e).

The second derivative is 2σ_1^2 + 2σ_2^2 - 4ρσ_1σ_2 and, since ρ = 1 is the largest possible value for ρ,

    2σ_1^2 + 2σ_2^2 - 4ρσ_1σ_2 >= 2σ_1^2 + 2σ_2^2 - 4σ_1σ_2 = 2(σ_1 - σ_2)^2 >= 0

so the second derivative is non-negative, indicating a minimum. This is a global minimum since the problem is quadratic in ω.
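As a final check on parts (e) and (f), the sketch below compares the closed-form minimum-variance weight with a brute-force grid search over ω ∈ [0, 1] (parameter values are arbitrary):

    import numpy as np

    # Example parameters (arbitrary).
    sigma_1, sigma_2, rho = 0.20, 0.10, 0.3

    def port_var(omega):
        """Portfolio variance from part (d); works on scalars or NumPy arrays."""
        return (omega**2 * sigma_1**2 + (1 - omega)**2 * sigma_2**2
                + 2 * omega * (1 - omega) * rho * sigma_1 * sigma_2)

    # Closed-form minimizer from part (e).
    omega_star = (sigma_2**2 - rho * sigma_1 * sigma_2) / (
        sigma_1**2 + sigma_2**2 - 2 * rho * sigma_1 * sigma_2)

    # Brute-force check: minimize the variance on a fine grid.
    grid = np.linspace(0, 1, 100_001)
    omega_grid = grid[np.argmin(port_var(grid))]
    print(omega_star, omega_grid)   # both ~0.105 for these parameters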