Mathematics of Finance Problem Set 1 Solutions

1. (Like Ross, 1.7) Two cards are randomly selected from a deck of 52 playing cards. What is the probability that they are both aces? What is the conditional probability that they are both aces, given that they are of different suits?

ANSWER: (a) Number of two-ace ordered pairs: 4 · 3. Total number of ordered pairs: 52 · 51. Probability: (4 · 3)/(52 · 51) = 1/221.

(b) Two aces are automatically of different suits, and the number of ordered pairs of cards of different suits is 52 · 39, so the conditional probability is (4 · 3)/(52 · 39) = 1/169.

2. Suppose 10 people toss their hats into a box and then take turns drawing hats at random, so each person ends up with one hat. How many possible permutations of the hats are there? Assume that each such permutation is equally likely. Let S_i be one if the ith person gets her own hat, zero otherwise.

(a) Calculate the mean and variance of S_i.

(b) Calculate the covariance of S_i and S_j when i ≠ j.

(c) Let S be the number of people who get their own hat. Calculate the mean and variance of S.

ANSWER: There are 10! possible permutations of the hats.

(a) E(S_i) is the probability that person i gets her own hat, which is 1/10. Since S_i² = S_i, we get Var(S_i) = E(S_i²) − [E(S_i)]² = 1/10 − 1/100 = 9/100.

(b) The covariance is E(S_i S_j) − E(S_i)E(S_j). The former term is the probability that both i and j get their own hats. There are 8! permutations in which i and j get their own hats, and 10! total permutations. Hence E(S_i S_j) = 8!/10! = 1/90. Thus

Cov(S_i, S_j) = E(S_i S_j) − E(S_i)E(S_j) = 1/90 − 1/100 = 1/900.

(c) E(S) = E(S_1 + S_2 + ... + S_10) = E(S_1) + E(S_2) + ... + E(S_10) = 10 · (1/10) = 1. For the variance, write Var(S) = Cov(S, S) = Cov(S_1 + S_2 + ... + S_10, S_1 + S_2 + ... + S_10). By the bilinearity of covariance, Cov(S, S) = Σ_{i=1}^{10} Σ_{j=1}^{10} Cov(S_i, S_j). There are 100 terms in this sum: 10 terms for which i = j and 90 terms for which i ≠ j. The former terms are each equal to 9/100 by (a), and the latter are each equal to 1/900 by (b). Hence the total sum is 90/100 + 90/900 = 1 = Var(S).
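These answers are easy to sanity-check with a short Monte Carlo simulation. A minimal Python sketch (the trial count of 10^5 is an arbitrary choice):

import random

trials = 10**5  # arbitrary number of simulated hat drawings
counts = []
for _ in range(trials):
    hats = list(range(10))
    random.shuffle(hats)  # a uniformly random permutation of the 10 hats
    # S = number of fixed points, i.e., people who draw their own hat
    counts.append(sum(1 for person, hat in enumerate(hats) if person == hat))

mean = sum(counts) / trials
var = sum((s - mean) ** 2 for s in counts) / trials
print(mean, var)  # both should come out close to 1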

3. (Ross 1.17) If Cov(X_i, X_j) = i · j, find

(a) Cov(X_1 + X_2, X_3 + X_4)

(b) Cov(X_1 + X_2 + X_3, X_2 + X_3 + X_4).

ANSWER: Use the bilinearity of covariance to get

Cov(X_1 + X_2, X_3 + X_4) = Cov(X_1, X_3 + X_4) + Cov(X_2, X_3 + X_4) = Cov(X_1, X_3) + Cov(X_1, X_4) + Cov(X_2, X_3) + Cov(X_2, X_4) = 3 + 4 + 6 + 8 = 21.

By a similar argument, part (b) is (1 + 2 + 3)(2 + 3 + 4) = 54.

4. Four dice are rolled. Let X_i be the value of the ith roll, and let Y = X_1 X_2 X_3 X_4 be the product of the rolls. Using the independence of the X_i to avoid long calculations, compute the mean and variance of Y.

ANSWER: By independence,

E(Y) = E(X_1 X_2 X_3 X_4) = E(X_1)E(X_2 X_3 X_4) = E(X_1)E(X_2)E(X_3 X_4) = E(X_1)E(X_2)E(X_3)E(X_4) = (7/2)^4.

Similarly,

E(Y²) = E(X_1² X_2² X_3² X_4²) = E(X_1²)E(X_2²)E(X_3²)E(X_4²) = ((1 + 4 + 9 + 16 + 25 + 36)/6)^4 = (91/6)^4.

Thus, Var(Y) = E(Y²) − E(Y)² = (91/6)^4 − (7/2)^8.
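Problem 4 is small enough to verify by exhaustive enumeration with exact rational arithmetic; a sketch:

from fractions import Fraction
from itertools import product

# Enumerate all 6^4 = 1296 equally likely outcomes of four dice and
# compute the exact mean and variance of the product Y.
products = [a * b * c * d for a, b, c, d in product(range(1, 7), repeat=4)]
n = len(products)
ey = Fraction(sum(products), n)
ey2 = Fraction(sum(y * y for y in products), n)

assert ey == Fraction(7, 2) ** 4
assert ey2 - ey ** 2 == Fraction(91, 6) ** 4 - Fraction(7, 2) ** 8
print(ey, ey2 - ey ** 2)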

5. Three hundred people on an airplane are offered two dinner choices: grilled chicken and vegetarian lasagna. Suppose that everyone takes a dinner and each person independently chooses the chicken with probability .6 and the lasagna with probability .4. Let C be the number of people who choose chicken and L the number of people who choose lasagna. Calculate the following: E(C), E(L), Var(C), Var(L), Cov(C, L). Using the central limit theorem, estimate the amount of lasagna and chicken the airline should have on hand to ensure that there is at most a five percent chance that there will be an insufficient supply of either of the two items.

ANSWER: Let C_i be the function that is 1 if the ith person gets chicken and zero otherwise, and let L_i be the function that is 1 if the ith person gets lasagna and zero otherwise. So L_i + C_i = 1 for each i ∈ {1, 2, ..., 300}. Then for each i, we have E(C_i) = .6 and E(L_i) = .4, so E(C) = 180 and E(L) = 120. Also, for each i, we have Var(C_i) = E(C_i²) − E(C_i)² = .6 − .36 = .24 and Var(L_i) = E(L_i²) − E(L_i)² = .4 − .16 = .24. The variance of a sum of independent random variables is the sum of the variances, so Var(C) = 300 Var(C_i) = 72 and Var(L) = 300 Var(L_i) = 72. The standard deviation is √72 ≈ 8.5.

With probability about .95, we will have C within two standard deviations of its mean, i.e., 180 − 17 ≤ C ≤ 180 + 17. And since L = 300 − C, we have 120 − 17 ≤ L ≤ 120 + 17. So if we have 197 chicken meals and 137 lasagna meals, there is an approximately ninety-five percent probability that there will be no shortage of either one. (If you momentarily forget about lasagna and only ask for the probability to be ninety-five percent that there is no chicken shortage, then you only need about 194 chicken meals. I also gave credit for this answer.)

Also, Cov(C, L) = Σ_{i=1}^{300} Σ_{j=1}^{300} Cov(C_i, L_j). The terms in this sum are zero if i ≠ j, by independence. But when i = j, we have E[C_i L_i] = 0 and E[C_i]E[L_i] = .24. Thus Cov(C_i, L_i) = −.24. Summing up the 300 non-zero terms gives Cov(C, L) = −72. Now, the correlation coefficient is

Cov(C, L)/(σ_C σ_L) = −72/(√72 · √72) = −1.

That is, C and L are maximally negatively correlated. (This makes sense since C + L = 300; so when C is high, L is low, and vice versa.)
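A simulation sketch of the stocking rule (the count of 10^4 simulated flights is arbitrary):

import random

flights = 10**4  # arbitrary number of simulated flights
no_shortage = 0
for _ in range(flights):
    # C ~ Binomial(300, .6) is the number of chicken choices; L = 300 - C
    c = sum(random.random() < 0.6 for _ in range(300))
    if c <= 197 and 300 - c <= 137:  # neither meal runs out
        no_shortage += 1
print(no_shortage / flights)  # should be roughly .95 or slightly above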

6. (Ross 2.9) A model for the movement of a stock supposes that, if the present price of the stock is s, then after one time period it will be either us with probability p or ds with probability 1 − p. Assuming that successive movements are independent, approximate the probability that the stock's price will be up at least 30 percent after the next 1000 time periods if u = 1.012, d = .990, and p = .52. (Hint: take logarithms and use the central limit theorem.)

ANSWER: Let X_i be equal to 1.012 if the stock goes up at the ith step and .990 if the stock goes down at the ith step. We are interested in knowing the probability that S(0)X_1X_2X_3...X_1000 is at least equal to 1.30 S(0). This is the same as the probability that the product X_1X_2X_3...X_1000 is at least 1.30, which is in turn the same as the probability that

log X_1 + log X_2 + ... + log X_1000 ≥ log 1.3.

For each 1 ≤ i ≤ 1000, write Y_i = log X_i. Then each Y_i is a random variable with

E[Y_i] = .52 log(1.012) + .48 log(.990) ≈ .0013787.

Also E[Y_i²] = .52 (log 1.012)² + .48 (log .990)² ≈ .000122476. Thus Var(Y_i) = E(Y_i²) − E(Y_i)² ≈ .000120575.

Write Y = log(X_1X_2...X_1000) = Σ_{i=1}^{1000} Y_i. Then E(Y) = 1000 E(Y_i) ≈ 1.3787 and Var(Y) = 1000 Var(Y_i) ≈ .120575. Thus, the standard deviation of Y is σ_Y ≈ .347239. Now, log 1.3 ≈ .262364, and

(E(Y) − log 1.3)/σ_Y ≈ 3.21488.

In other words, log 1.3 is 3.21488 standard deviations of Y below the expected value of Y. The central limit theorem says that the probability that Y is less than log 1.3 is approximately the same as the probability that a standard normal variable is less than −3.21488. Consulting a normal distribution table, we see that this probability is about .00065. The answer to the original question is thus about 1 − .00065 = .99935.
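A direct simulation of the model agrees with this estimate; a sketch (10^4 sample paths, an arbitrary choice, takes a few seconds in pure Python):

import random

paths = 10**4  # arbitrary number of simulated 1000-step price paths
up_30_percent = 0
for _ in range(paths):
    factor = 1.0
    for _ in range(1000):
        factor *= 1.012 if random.random() < 0.52 else 0.990
    if factor >= 1.30:
        up_30_percent += 1
print(up_30_percent / paths)  # should be close to 1 - .00065 = .99935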

7. (Ross 1.13) Let X_1, ..., X_n be independent random variables, all having the same distribution with expected value µ and variance σ². The random variable X̄, defined as the arithmetic average of these variables, is called the sample mean. That is, the sample mean is given by

X̄ = (X_1 + X_2 + ... + X_n)/n.

(a) Show that E[X̄] = µ.

(b) Show that Var[X̄] = σ²/n.

The random variable S², defined by

S² = Σ_{i=1}^n (X_i − X̄)²/(n − 1),

is the sample variance. (The denominator is n − 1, not n, due to (d).)

(c) Show that Σ_{i=1}^n (X_i − X̄)² = Σ_{i=1}^n X_i² − nX̄².

(d) Show that E[S²] = σ².

ANSWER: (a)

E[X̄] = E[(Σ X_i)/n] = (1/n) Σ E[X_i] = (1/n)(nµ) = µ.

(b) Since the X_i are independent, the variance of their sum is the sum of their variances, so

Var[X̄] = Var[(Σ X_i)/n] = (1/n²) Var[Σ X_i] = (1/n²)(nσ²) = σ²/n.

(c) Using Σ X_i = nX̄,

Σ (X_i − X̄)² = Σ [X_i² − 2X_i X̄ + X̄²] = Σ X_i² − 2X̄ Σ X_i + nX̄² = Σ X_i² − 2nX̄² + nX̄² = Σ X_i² − nX̄².
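Before using (c) in part (d), the identity can be spot-checked numerically; a sketch on arbitrary made-up data:

# Numerical spot-check of the identity in (c) on an arbitrary sample.
xs = [2.0, 3.5, 5.0, 7.25, 11.0]
n = len(xs)
xbar = sum(xs) / n
lhs = sum((x - xbar) ** 2 for x in xs)
rhs = sum(x * x for x in xs) - n * xbar ** 2
assert abs(lhs - rhs) < 1e-9
print(lhs, rhs)  # identical up to floating-point rounding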

(d) Using part (c), we have

E[S²] = E[Σ (X_i − X̄)²/(n − 1)] = (1/(n − 1)) E[Σ X_i² − nX̄²].

Now, using the previous parts, we see that for each i, σ² = Var(X_i) = E[X_i²] − E[X_i]², so E[X_i²] = σ² + µ². Also, σ²/n = Var[X̄] = E[X̄²] − E[X̄]² = E[X̄²] − µ², so E[X̄²] = σ²/n + µ². Plugging these values for E[X_i²] and E[X̄²] into the expression for E[S²], we compute

E[S²] = (1/(n − 1)) [n(σ² + µ²) − n(σ²/n + µ²)] = (1/(n − 1)) [nσ² + nµ² − σ² − nµ²] = (1/(n − 1))(n − 1)σ² = σ².
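A simulation sketch of the point of (d), using standard normal samples of size 5 and 10^5 replications (both arbitrary choices):

import random

n, reps = 5, 10**5  # arbitrary sample size and replication count
unbiased_total = biased_total = 0.0
for _ in range(reps):
    xs = [random.gauss(0, 1) for _ in range(n)]  # true variance is 1
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    unbiased_total += ss / (n - 1)  # the sample variance S^2
    biased_total += ss / n          # the n-denominator alternative
print(unbiased_total / reps)  # close to 1: E[S^2] = sigma^2
print(biased_total / reps)    # close to (n - 1)/n = 0.8: biased low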