MAS223 Statistical Inference and Modelling Exercises and Solutions

Size: px
Start display at page:

Download "MAS223 Statistical Inference and Modelling Exercises and Solutions"

Transcription

1 MAS3 Statistical Inference and Modelling Exercises and Solutions The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions, ordinary questions, and challenge questions Note that there are no exercises accompanying Chapter 8 The vast majority of exercises are ordinary questions Ordinary questions will be used in homeworks and tutorials; they cover the material content of the course Warm-up questions are typically easier, often nothing more than revision of relevant material from first year courses Challenge questions are typically harder and test ingenuity This version of the exercises also contains solutions, which are written in blue Solutions to challenge questions are not always included, hints may be given instead Some of the solutions mention common pitfalls, written in red, which are mistakes that are sometimes easily made The solutions sometimes omit intermediate steps of basic calculations, which are left to the reader For example, they may simply state x λe λu e λu, and leave you to fill in the intermediate steps Contents Univariate Distribution Theory Standard Univariate Distributions 3 Transformations of Univariate Random Variables 5 4 Multivariate Distribution Theory 5 5 Transformations of Multivariate Distributions 34 6 Covariance Matrices and and Multivariate Normal Distributions 43 7 Likelihood and Maximum Likelihood 5

2 Univariate Distribution Theory Warm-up Questions Let X be a random variable taking values in,, 3}, with P[X ] P[X ] 4 Find P[X 3], and calculate both E[X] and Var[X] Since P[X ] + P[X ] + P[X 3], we have P[X 3] With this, we can calculate E[X] P[X ] + P[X ] + 3P[X + 3] 8 E[X ] P[X ] + P[X ] + 3 P[X + 3] 38 Using that VarX E[X ] E[X], we have VarX 56 Let Y be a random variable with probability density function pdf fy given by y/ for y < ; fy otherwise Find the probability that Y is between and Calculate E[Y ] and Var[Y ] We have P[Y [, ]] / y/ dy 3/6 Similarly, E[Y ] E[Y ] so VarY E[Y ] E[Y ] /9 yfy y 3 / dy yy/ dy 4/3 Ordinary Questions 3 Define F : R [, ] by for y ; F y y for y, ; for y a Sketch the function F, and check that it is a distribution function b If Y is a random variable with distribution function F, calculate the pdf of Y a Sketch of F should look like

3 From the graph, F is continuous and non-strictly increasing We have F x for all x, so lim x F x Similarly, F x for all x, so lim x F x Hence, F satisfies all the properties of a distribution function b We have fy F y, so treating each case in turn, for y ; fy y for y, ; for y 4 Let X be a discrete random variable, taking values in,, }, where P[X n] 3 for n,, } Sketch the distribution function F X : R R Sketch should look like Pitfall: The graph of F is not continuous; it jumps at,,, and is otherwise constant 5 Define f : R [, ] by fx for x < ; e x for x a Show that f is a probability density function b Find the corresponding distribution function and evaluate P[ < X < ] a Clearly fx for all x, and fx dx so f is a probability density function e x dx [ e x], 3

4 b We need to calculate F x P[X x] x fu du For x we have F x x dx For x, we have F x du + x e u, du + e x Thus, for x ; F x e x for x Hence, P[ < X < ] P[X < ] P[X ] P[X ] P[X ] F F e e 6 Sketch graphs of each of the following two functions, and explain why each of them is not a distribution function for x ; a F x x for x > for x < ; b F x x + sin πx for x < ; 4 for x Sketches should look like For a, F x > for x >, so F does not stay between and For b, for x [, we have F x fx + π 4 cosπx, which is negative at, for example, x, so as is clear from the graph F is not an increasing function 7 Let k R and define f : R R by kx x for x,, fx otherwise Find the value of k for which fx is a probability density function, and calculate the probability that X is greater than We need fx for all x, so we need k Also, we need [ ] x fx dx k x x dx k x3 k 3 6 So, k 6 Therefore, P[X ] fx dx [ ] x 6x x dx 6 x3 3 4

5 8 The probability density function fx is given by + x for x ; fx x for < x < ; otherwise Find the corresponding distribution function F x for all real x We have F P[X ] P [X < ] So F x for all x If x [, ] then F x x Now, for x [, ], we have fu du F + x + u du + x + F x x fu du F + x u du + x x + x x Pitfall: Forgetting the F results in missing out the term It needs to be present because for x, we have F x P[X x] P[X ] + P[ < X < x] F + x u du Note that in the case of x [, ] the equivalent term was F and was equal to Therefore, we have F Since F is increasing and must stay between and, we have F x for all x Thus the distribution function F x is, for x < x+ F x, for x < +x x, for x <, for x 9 Let F x ex + e x for all real x a Show that F is a distribution function, and find the corresponding pdf f b Show that f x fx c If X is a random variable with this distribution, evaluate P[ X > ] a Since e x as x, we have lim x lim x e x + e x + e x + e x lim x 5 e x + +

6 Pitfall: It does not make sense to use that lim x e x and then incorrectly claim that + Using the quotient rule, the derivative of f, the corresponding pdf, is fx ex + e x e x e x + e x e x + e x Therefore, fx > for all x R, so F is an increasing function Since x e x is continuous, F is a composition of sums and non-zero divisions of continuous functions; therefore F is continuous Hence, F satisfies all the properties of a distribution function b f x c We have e x ex e x ex fx +e x e x +e x e x + P[ X > ] P[X < ] + P[X > ] P[X ] + P[X ] + F F + e +e e +e 38 Note that here we used P[X < ] P[X ], which holds because F is continuous Show that fx π, defined for all x R, is a probability density function +x Clearly fx for all x, and f X x dx π + x dx [ ] arctanx π π π π a For which values of r [, is x r dx finite? b Show that x if x > fx otherwise is a probability density function c Show that the expectation of a random variable X, with the probability density function f given in b, is not defined d Give an example of a random variable Y for which E[Y ] < but E[Y ] is not defined a For r we have x r dx lim n which is finite and equal to we have r n x dx lim n x r n r+ dx lim n r + r + if r + < and infinite if r + > When r n Hence, x r dx is finite if and only if r > 6 x dx lim n log n

7 b Clearly fx for all x and so f is a probability density function x dx [ x ] x c we have xf Xx x dx, which by part a is infinite Hence the expectation of X is not defined d Let Then fy for all y, and fy fy dy y 3 if x > otherwise y 3 dy [ y ] y so f is a probability density function From a we have that is infinite and that y f Y ydy yf Y ydy y dy y dy 3 is finite Hence, if Y is a random variable with pdf f, then E[Y ] is defined but E[Y ] is not The discrete random variable X has the probability function P[X x] a Use the partial fractions of xx+ xx+ for x N otherwise to show that P[X x], for all x N x+ b Write down the distribution function F x of X, for x R Sketch its graph What are the values of F and F 3? c Evaluate P[ X ] d Is E[X] defined? If so, what is E[X]? If not, why not? a Using partial fractions we obtain the identity xx+ x Hence, if x N, P[X x] x P[X x] x+, provided x, x x + x + 7

8 b The distribution function is F u x+, for u [x, x + where x N, for u < which looks like F 3 3 and F 3 F Pitfall: The graph of F is not continuous The formula obtained in part a is only valid for x N, and not for all x R Since X is a discrete random variable, its distribution function jumps at the points where X takes values ie at x N and is constant in between those points c Since X is discrete, P[ X ] P[X ] P[X 9] F F 9 Pitfall: X is discrete, and P[X ] > So it s F F 9, and not F F d Since X is discrete, E[X] is defined if and only if x xp[x x] converges In our case, this sum is equal to x xx + x + which diverges x To see that the sum diverges, we can write it as x and note that each bracketed term is at least ; of course x Challenge Questions 3 Show that there is no random variable X, with range N, such that P[X n] is constant for all n N Since X has range N we have P[X N] If P[X n] P[X ] for all n then we would have P[X N] P[X n] P[X ] n 8 n

9 If P[X ] then P[X N], which is impossible, but similarly if P[X ] > then the sum is equal to +, which is not possible either Hence, no such random variable exists 4 Recall the meaning of inscribing a cube within a sphere: the cube sits inside of the sphere, with each vertex of the cube positioned on the surface of the sphere It is not especially easy to illustrate this on a two dimensional page, but here is an attempt: Suppose that ten percent of the surface of the sphere is coloured blue, and the rest of the surface is coloured red Show that, regardless of which parts are coloured blue, it is always possible to inscribe a cube within the sphere in such a way as all vertices of the cube are red Hint: A cube has eight corners Suppose that position of the cube is sampled uniformly from the set of possible positions What is the expected number of corners that are red? Let X be a cube inscribed within the sphere, orientated uniformly at random Strictly speaking, in three-dimensional spherical coordinates r, θ, φ this means we let the polar angle θ and azimuth angle φ be independent uniform random variables on, π More importantly, it means that the location of a given vertex of the cube is distributed uniformly on the surface of the sphere Let X be the number of corners that are red Label the corners from i,, 8 We can write 8 X Ai red}, where A i is the colour of the i th corner Here Ai red} is equal to if A i is red and equal to zero if A i is blue Hence, 8 E[X] E [ ] 8 Ai red} P[A i red] Since A i is uniformly distributed on the surface of the sphere, and 9% of the sphere is red, we have P[A i red] 9 Hence, E[X] Note that X can only take the values,,, 8} Since E[X] > 7, we must have P[X 8] > Therefore, there are orientations of the cube for which all 8 vertices are red 9

10 Standard Univariate Distributions Warm-up Questions a A standard fair dice is rolled 5 times Let X be the number of sixes rolled Which distribution and which parameters would you use to model X? b A fair coin is flipped until the first head is shown Let X be the total number of flips, including the final flip on which the first head appears Which distribution and which parameter would you use to model X? a The binomial distribution, with parameters n 5 and p 6 b The geometric distribution, with parameter p Ordinary Questions Let λ > Write down the pdf f of the random variable X, where X Expλ, and calculate its distribution function F Hence, show that is constant for t > The pdf of X is fx λe λx for x > ; otherwise ft F t It s distribution function F x x fu du is clearly zero for x, and for x > we have x fu du x λe λu du e λx Therefore, e λx for x > ; F x otherwise ft F t λe λt Hence, for t > we have λ e λt Pitfall: Don t forget the otherwise case where fx or F x The same comment applies to many other questions 3 Let λ > and let X be a random variable with Expλ distribution Let Z X, that is let Z be X rounded down to the nearest integer Show that Z is geometrically distributed with parameter p e λ Since X >, we have Z,,, }, hence P[Z z] for all other z For n,,, } we have P[Z n] P[n X < n + ] n+ n λe λx dx e λn e λn+ e λn e λ p n p

11 which is the probability function of the geometric distribution 4 Let µ R Let X and X be independent random variables with distributions Nµ, and Nµ, 4, respectively Let T, T and T 3 be defined by T X + X, T X X, T 3 4X + X 5 Find the mean and variance of T, T and T 3 Which of E[T ], E[T ] and E[T 3 ] would you prefer to use as an estimator of µ? We have E[T ] E[X ] + E[X ] E[T ] µ Similarly, E[T ] E[T 3 ] µ, so all are unbiased when used as estimators of µ We have VarT VarX + CovX, X + VarX , and similarly VarT 8, VarT On this information, we prefer E[T 3 ] as an estimator of µ, because T 3 has the smallest variance and so is likely to be closest to its mean 5 Let X be a random variable with Gaα, β distribution a Let k N Show that E[X k ] αα + α + k β k Hence, calculate µ E[X] and σ VarX and verify that these formulas match the ones given in lectures [ b Show that E X µ 3 ] σ α a From the pdf of the Gamma distribution, we have E[X k ] x k βα βα Γα xα e βx dx x k+α e βx dx Γα βα Γk + α Γα β k+α Γk + α β k Γα k + α k + α k + α k + k + α kγα β k Γα αα + α + k β k To deduce the second line from the third line, we use Lemma 3 from lecture notes, and to deduce the fifth line from the fourth line we use Lemma, also from lecture notes

12 Pitfall: It is true that Γn n! for n N, but if n / N then n! does not make sense For general α, we have Γα α Γα, which is what must be be used to deduce the fifth line above If k then E[X] a b If k then E[X ] αα+ β VarX E[X ] E[X] αα+ β α β α β and so the variance of X is b With µ E[X] and σ VarX, multiplying out and using the formulae from a, we have E[X µ 3 ] σ 3 E[X3 3X µ + 3Xµ µ 3 ] σ 3 E[X3 ] 3E[X]E[X ] + E[X] 3 αα+α+ β 3 σ 3 3α α+ β 3 σ 3 + α3 β 3 αα + α + α + 3α 3α + α /β 3 α 3 /β 3 α 6 a Using R, you can obtain a plot of, for example, the pdf of a Ga3, random variable between and with the command curvedgammax,shape3,scale,from,to Use R to investigate how the shape of the pdf of a Gamma distribution varies with the different parameter values In particular, fix a value of β, see how the shape changes as you vary α b Investigate the effect that changing parameters values has on the shape of the pdf of the Beta distribution To produce, for example, a plot of the pdf of Be4, 5, use curvedbetax,shape4,shape5,from-,to a You should discover that decreasing α makes the pdf appear more skewed to the right This makes it more likely that a sample of the random variable has a large value b You should discover that the parameter shape which we normally denote by α controls the behaviour near x, and shape that is, β, controls the behaviour near x In both case, the parameters can be tuned to cause slow or fast explosion to, convergence to, and slow or fast convergence towards 7 Suggest which standard discrete distributions or combination of them we should use to model the following situations

13 a Organisms, independently, possess a given characteristic with probability p A sample of k organisms with the characteristic is required How many organisms will need to be tested to achieve this sample? b In Texas Hold em Poker, players make the best hand they can by combining two cards in their hand with five community cards that are placed face up on the table At the start of the game, a player can only see their own hand The community cards are then turned over, one by one A player has two hearts in her hand Three of the community cards have been turned over, and only one of them is a heart How many hearts will appear in the remaining two community cards? Use a computer to find the probability of seeing k,, hearts a We ll need to sample, with success probability p, until we achieve k successes So we will need to test N NegBink, p organisms to find our sample b There are a total of 5 cards, 3 of each of the four suits Our player can see 5 cards, 3 of which are hearts Therefore, the unknown cards consist of 47 cards, of which are hearts The number of hearts that will be drawn in the next two community cards is, therefore, a hypergeometric distribution with parameters N 47 population size, k successes, n trials As a result use eg R, P[X ] 65, P[X ] 3 and P[X ] 3 8 Let X be a N, random variable Use integration by parts to show that E[X n+ ] n + E[X n ] for any n,,, Hence, show that E[X n if n is odd; ] 35 n if n is even Integrating by parts, for any n, gives E[X n ] x n π e x / dx [ x n+ / π n + e x + x n+ ] x n+ xe x / dx n + π e x / dx n + π n + E[Xn+ ] Rearranging slightly, E[X n+ ] n + E[X n ] Since E[X], induction gives that E[X n ] for all odd n Since E[X ], induction gives that E[X n ] 35 n for all even n 9 Let X Nµ, σ Show that E[e X σ µ+ ] e 3

14 Using the scaling properties of normal random variables from, we write X µ + Y where Y N, σ Hence, e X e µ+y e µ e Y and E[e Y ] πσ πσ πσ e σ e σ πσ e y e y σ dy y σ + σ y } exp dy exp y + σ σ σ 4 } dy exp y + σ } σ dy Here, we use the same method as in Example 5: in the third line we complete the square and to deduce the final line we use that the pdf of a N σ, σ random variable integrates to Therefore, E[e X ] e µ E[e Y σ µ+ ] e Challenge Questions Let X be a random variable with a continuous distribution, and a strictly increasing distribution function F Show that F X has a uniform distribution on, Suggest how we might use this result to simulate samples from standard distributions Since F is strictly increasing, it has an inverse function F For x,, we have P[F X x] P[X F x] F F x x Hence, F X has the uniform distribution on, We write this as U F X, where U is uniform on, Therefore, F U X Consequently, if we can simulate uniform random variables, and calculate F x for given x, we can simulate X as F U In fact, this is a very common way of simulating random variables Recall that a distribution function F is not necessarily strictly increasing, but it is necessarily non-strictly increasing With some care, it is possible to extend this result to cover the general case For many standard distributions, F can be computed explicitly Prove that Γ π Hint Thanks to the normal distribution, you know that e x / dx π 4

15 3 Transformations of Univariate Random Variables Warm-up Questions 3 Let g : R R be given by gx x 3 + a Sketch the graph of g, show that g is strictly increasing, and find its inverse function g b Let R [, ] Find gr a A sketch of the function g looks like It is clear from the graph that g is strictly increasing If we set y x 3 + then x y /3, so the inverse function is g y y /3 b We have g and g 3 + 9, so gr [, 9] Ordinary Questions 3 Let X be a random variable with pdf x for x > f X x otherwise Define gx e x and let Y gx a Show that gx is strictly increasing Find its inverse function g y, and dg y dy b Identify the set R X on which f X x > Sketch g and show that gr X e, c Deduce from a and b that Y has pdf log y for y > e y f Y y otherwise 5

16 a We have dgx dx e x >, so g is strictly increasing If we have y e x then x log y, so the inverse function is g y log y Hence dg y dy y b f X x is non-zero for x R X, A sketch of g looks like and hence gr X e, c Since g is strictly increasing, we can use the formula fx g y dg y dy if y gr X f Y y otherwise, log y y for y > e otherwise 33 Let X be a random variable with the uniform distribution on,, and let Y log X λ where λ > Show that Y has an Expλ distribution We have f X x for x, and f X x otherwise Hence R X, Our transformation is gx log x λ For x > we have dg dx λx <, so g is strictly decreasing A sketch of g looks like from which we can see that gr X, Writing y log x λ, we have x e λy so g y e λy and dg dy λe λy Hence, we can apply Lemma 3 to find f Y y Thus, fx g y dg y dy for y gr X f Y y otherwise 6

17 Hence, using parts a,c and d we obtain λe λy for y, ; f Y y otherwise This is the pdf of the Expλ distribution Hence, Y Expλ Pitfall: Don t forget to comment that g is strictly decreasing or increasing, or else it isn t clear that you ve checked whether or not Lemma 3 applies 34 Let α, β > a Show that Bα, β Bβ, α b Let X be a random variable with the Beα, β distribution Show that Y X has the Beβ, α distribution a We have Bα, β ΓαΓβ Γα+β ΓβΓα Γα+β b We aim to use Lemma 3 We have Bβ, α f X x Define g :,, by Bα,β xα x β for x, ; otherwise gx x and then Y gx Note that g is strictly increasing on, We have g y y and dg dy A sketch of g looks like from which we can see that g,, Hence, from Lemma 3 we have f Y y f X g y dg dy Bα, β yα y β for y,, giving f Y y This is the pdf of the Betaβ, α distribution Bβ,α yβ y α for y, ; otherwise 7

18 35 Let α > a Show that Bα, α b Let X Beα, distribution Let Y r X for some positive integer r Show that Y also has a Beta distribution, and find its parameters a From Lemma we have Γα + αγα, hence Bα, ΓαΓ Γα + α b We aim to use Lemma 3 From a we have, Bα, f X x xα for x,, otherwise, αx α for x,, otherwise Hence, R X, The function gx r x is strictly increasing on, and gr X, We have g y y r and d dy g y ry r Hence, by Lemma 3, for y, we have Thus, f Y y αy rα ry r rαy rα f Y y rαy rα if y, otherwise which is the pdf of a Berα, distribution parameters rα and So Y has a Beta distribution with 36 Let X have a uniform distribution on [, ] Find the pdf of X and identify the distribution of X If x < then P[ X ] And since X [, ], if x > we have P[ X x] For x [, ] we have P[ X x] P[ x X x] 8 x x f X u du x x du x

19 Differentiating, we have Therefore, X is uniform on [, ] f X x for x [, ]; otherwise Pitfall: If we try the standard method of Lemma 3, we ll have gx x, which is not monotone on [, ]; so the formula f Y y f X g y dg dy does not apply here and it will give the wrong answer The same issue applies to several other questions in this section 37 Let α, β > and let X Beα, β Let c > and set Y c/x Find the pdf of Y We aim to use Lemma 3 We have X Beα, β and Y c/x, so set gx c/x where g :, R Therefore, g is strictly decreasing on, Then g y c dg y, and dy Further, y f X x > for x,, R X, and gr X c, Hence, for y > c we have f Y y For y c, we have f Y y Bα, β c α c β c y y y cα y c β Bα, β y α+β c 38 Let Θ be an angle chosen according to a uniform distribution on π, π, and let X tan Θ Show that X has the Cauchy distribution We aim to use Lemma 3 with X Θ The random variable Θ has a U π, π distribution, so f Θ θ π for π < θ < π ; otherwise The function gθ tan θ is strictly increasing on the range of Θ, that is on π, π 9

20 Moreover, g y arctany and g y π, π for all y R Hence, for all y R, f Y y f Θ g y Hence, Y has the Cauchy distribution dg dy π + y 39 Let X be a random variable with the pdf + x for < x < ; fx x for < x < ; otherwise Find the probability density functions of a Y 5X + 3 b Z X a The function gx 5x + 3 is strictly increasing, so we can use the formula f Y y f X g y d dy g y on each of the intervals in the definition of f X We note g y y 3 5 and d dy g y 5, so + y for < y 3 5 < f Y y y for < y 3 5 < otherwise y+ 5 for < y < 3 8 y 5 for 3 < y < 8 otherwise

21 b The function gx x is not monotonic, so instead we use that for z F Z z P[Z z] P[ z X z] z + u du + z u du for z for z > z z for z for z > If z < then P[ X < z] So the pdf of Z is f Z z d z z < dz F Zz otherwise 3 Let X have the uniform distribution on [a, b] a For [a, b] [, ], find the pdf of Y X b For [a, b] [, ], find the pdf of Y X a The probability density function of X is f X x for x [, ], otherwise We have Y, so f Y y for y For y, ] we have P[Y y] P[ X y] P[ y X y] y y dy y Thus, P[Y ] and hence P[Y y] for all y Differentiating, we obtain for y, f Y y y / for y, ], for y > b The probability density function of X is f X x 3 for x [, ], otherwise We have Y, so f Y y for y For y >, we need to consider three cases; y, ] and y, ] and y > For y [, ], we have P[Y y] P[ y X y] y y 3 dy y 3

22 For y, ], we have P[Y y] P[Y ] + P[ < X y] 3 + y 3 dy 3 + y 3 For y <, we note that P[Y ] from the previous case, so P[Y y] for all y > Differentiating, we obtain f Y y 3 for y [, ], 3 for y, ] for y or y >, 3 Let X have a uniform distribution on [, ] and define g : R R by for x ; gx x for x > Find the distribution function of gx Clearly Y gx, so P[Y y] for all y < We have For y, ] we have P[Y ] P[X ] dx P[Y y] P[Y ] + P[ < Y y] + y dy which means that P[Y ] and hence P[Y y] for all y To sum up, for y <, for y, F Y y y+ for y, ], for y > y +, 3 Let X be a random variable with the Cauchy distribution Show that X also has the Cauchy distribution The random variable X has pdf f X x Define g : R \ } R \ } π+x by gx /x

23 Note that P[X ], so the distribution of Y /X is still well defined Y gx, but g is not strictly monotone for example, g < g > g We have We plan to show that P[Y y] P[X y] for all y R, which means that X and Y have the same distribution function; hence the same distribution Let us look at the case y first There is no value of x R such that gx, so P[Y ] Since X is a continuous distribution, P[X ] Also X < if and only if Y <, so we have P[X ] P[X < ] P[Y < ] P[Y ] For y <, we have [ ] P[Y y] P y X < y y y π + x dx π + /v v dv π + v dv P[X y] Here, we substitute x /v For y >, we must split into two cases, giving Again, we substitute x /v P[ X y] P[X < ] + P[ y X] P[X < ] + P[X < ] + P[X < ] + y y y π + x dx π + /v v dv π + v dv P[X < ] + P[ X y] P[X y] Hence, P[X y] P[Y y] for all y R Since X has a Cauchy distribution, so does Y Challenge Questions 33 If we were to pretend that gx /x was strictly monotone, we could incorrectly apply Lemma 3 and use the formula f Y y f X g y dg to solve 3 We dy would still arrive at the correct answer Can you explain why? Can you construct another example of a case in which the relationship f Y y f X g y dg holds, but where the function g is not monotone? dy Hint Carefully examine the proof of the formula for f Y y f X g y dg dy when g is strictly monotone, and compare it to the solution of 3 In general though, if g is not strictly monotone, the formula will not work! 34 Let Y and α, β be as in Question 37 3

24 a If α >, show that E[Y ] cα+β α b If α show that E[Y ] is not defined a We use that E[gX] gxf Xx dx So [ c ] E X c x f Xx dx c x Bα, β xα x β dx c x α x β dx Bα, β cbα, β, Bα, β Expanding the Beta functions here in terms of the Gamma function, and using that Γα α Γα, we get c Γα Γβ Γα+β ΓαΓβ Γα+β c Γα Γβ Γα+β α Γα Γβ α+β Γα+β cα + β α b If we try to calculate E[Y ] then, using the pdf obtained in 37b we want c α Bα, β c y c β y α dy 3 We need to show that the integral does not converge The idea is to note that y c y, for large y, which means that y c y β ; leaving us with c y α dy which diverges to To implement this idea, we could begin by noting that 3 cα Bα, β c+ Since y > c +, we have c+ y c y β, ] then y c y β Hence, 3 cα Bα, β min c + β, y y c β y α dy 3 y If β then y c y β c+ β, and if y α dy c+ Since c+ y α dy diverges for α, we have that E[Y ] does not exist for such α 4

25 4 Multivariate Distribution Theory Warm-up questions 4 Let T x, y : < x < y} Define f : R R by e x y for x, y T ; fx, y otherwise Sketch the region T Calculate that they are equal See 4 for a sketch of T We have y x fx, y dx dy fx, y dx dy and fx, y dy dx, and verify y y x e y e x dx dy e y e y + [ e y + ] 3 e 3y y dy e y [ 3 e x ] y x e y e 3y dy 3 dy and x y which are equal fx, y dy dx x yx e y e x dx dy e x e x dx e 3x dx e x [ e y] yx dx [ ] 3 e 3x x 3, Pitfall: We must be careful to get the limits on the integrals correct For dy dx, we first allow x to vary between, which means that to cover T we must allow y to vary between x and We think of covering T by vertical lines, one for each x, each line with constant x and with y ranging from x up to It s helpful to draw a picture Alternatively, for dx dy, we first allow y to vary between and, which means that to cover T we must allow x to vary between and y We think of covering T by horizontal lines, one for each y, each line with constant y and with x ranging from up to y 4 Sketch the following regions of R a S x, y : x [, ], y [, ]} b T x, y : < x < y} c U x, y : x [, ], y [, ], y > x} 5

26 Ordinary Questions 43 Let X, Y be a random vector with joint probability density function ke x+y if < y < x f X,Y x, y otherwise a Using that P[X, Y R ], find the value of k b For each of the regions S, T, U in 4, calculate the probability that X, Y is inside the given region c Find the marginal pdf of Y, and hence identify the distribution of Y a Since P[X, Y R ], we have f X,Y x, y dy dx Therefore, k x e x+y dy dx k Hence, k We could also calculate k Pitfall: [ e x+y] x y dx k e x e x dx k y e x+y dx dy, with the same result We must be careful to get limits of the inner integral correct Allowing x to vary from, and then allowing y to vary from y x, draws out precisely the range of x, y that make up x, y : < y < x} It s very helpful to draw a sketch of the region you re trying to integrate over: See 4 for details of a similar case The same issue applies to part b of this question, and to many others 6

27 b We have P [X, Y S] x e x+y dy dx e x e x dx e + e + Since f X,Y x, y for all x, y T, we have P[X, Y T ] Lastly, P [X, Y U] x x/ e x+y dy dx e x + e 3x/ dx e 4 3 e 3/ + 3 In each case, we could instead calculate the dx dy integral, with appropriate limits c We integrate x out, giving, for y >, f Y y f X,Y x, y dx y e x+y dx [ e x+y] xy e y So Y has an Exponential distribution with parameter λ or, equivalently, a Gamma distribution with parameters α and β 44 Let S [, ] [, ], and let U and V have joint probability density function 4u+v u, v S; 3 f U,V u, v otherwise a Find P[U + V ] b Find P[V U ] a We need to integrate the joint pdf over the subset of u, v where u + v Note that the joint pdf is only non-zero when u and v So, we need to integrate over u, v : u, v u} and we get P [U + V ] 3 u 4u + v 3 dv du 3 + u 3u du 3 4u u + u du b We need to integrate the joint pdf over the subset of u, v where v u Note that the joint pdf is only non-zero when u [, ] and v [, ] So, we need to integrate over u, v : u and v u } and we get P [ V U ] u 4u + v 3 dv du 3 4u 3 + u 4 du 5 7

28 45 For the random variables U and V in Exercise 44: a Find the marginal pdf f U u of U b Find the marginal pdf f V v of V c For v such that f V v >, find the conditional pdf f U V v u of U given V v d Check that each of f U, f V and f U V v integrate over R to e Calculate the two forms of conditional expectation, E[U V v], and E[U V ] a We must integrate v out of f U,V u, v For u [, ] this gives f U u f U,V u, v dv 4u + v 3 dv 4u + 3 For u / [, ] we have f U,V u, v so also f U u Hence, the pdf of U is 4u+ f U u 3 for u [, ]; otherwise b Now, we integrate u out For v [, ] we have f V v f U,V u, v du 4u + v 3 du v + 3 For v / [, ] we have f U,V u, v so also f V v Hence, the pdf of V is v+ f V v 3 for v [, ]; otherwise c When f V v >, we have f U V v u f U,V u,v f V v, which means that for v [, ] we have u+v f U V v u v+ for u [, ]; otherwise d We check that finally that e For v [, ] we have 4u+ 3 du [ u+v +v du u +vu +v gv E[U V v] Therefore, E[U V ] gv +V [ ] u +u 3 ] u + v, and that as required uf U V v u du [ u V 3 + vu ] u v+ 3 dv u + uv du + v + v 3 + v [ ] v +v 3 and 8

29 46 Let X, Y be a random vector with joint probability density function y x x [, ], y [, ]; x+y f X,Y x, y x [, ], y [, ]; otherwise Find the marginal pdf of X Show that the correlation coefficient X and Y is zero Show also that X and Y are not independent The marginal pdf of X is given by f X x f X,Y x, y dy If x [, ] then this is y x dy x 4, and if x [, ] it is x+y dy +x 4 ; otherwise it is zero This gives E[X], because f X x f X x To find E[XY ] we calculate xyf X,Y x, y dx dy y x x y x y + y x dx dy + y dy y dy 6 y 4 y 6 dx dy Hence, the covariance is E[XY ] E[X]E[Y ] note that we do not need to know the value of E[Y ] Hence the correlation coefficient is also zero To show dependence, we will show that the joint pdf of X and Y does not factorise into the marginal densities For example, dividing f X,Y x, y by f X x gives y+x if x [, ] and y x x if x [, ], which does not depend only on y, so cannot be equal to f Y y Hence, X and Y are not independent 47 Let X be a random variable Let Z be a random variable, independent of X, such that P[Z ] P[Z ] Let Y XZ a Show that X and Y are uncorrelated b Give an example in which X and Y are not independent a We have CovX, Y E[XY ] E[X]E[Y ] E[X Z] E[X]E[XZ] E[X ]E[Z] E[X]E[X]E[Z] Note that to deduce the second line we use the independence of X and Z, and to deduce the third line we use that E[Z] + b Let P[X ] P[X ], which means that P[Y ] P[Y ] and P[Y ] We have +x P[X, Y ] P[X, Z ] P[X ]P[Z ] P[X ]P[Y ]

30 Hence X and Y are not independent In fact, any example in which X is not equal to a constant will result in X and Y not being independent Challenge question: prove this Pitfall: If X and Y are independent then CovX, Y However, there are many examples, such as this one, of cases in which CovX, Y where X and Y are not independent 48 Let X and Y be independent random variables, with < VarX VarY < Let U X + Y and V XY Show that U and V are uncorrelated if and only if E[X] + E[Y ] Recall that U and V are uncorrelated if and only if CovU, V Using that X and Y are independent, we note that CovU, V E[UV ] E[U]E[V ] E[X + Y XY ] E[XY ]E[X + Y ] E[X Y + Y X] E[X]E[Y ]E[X] + E[Y ] E[X ]E[Y ] + E[Y ]E[X] E[X] E[Y ] E[X]E[Y ] E[Y ] E[X ] E[X] + E[X] E[Y ] E[Y ] E[Y ] VarX + E[X] VarY E[Y ] + E[X] VarX The last line follows because VarX VarY Hence, U and V are uncorrelated if and only if E[X] E[Y ] 49 Let λ > Let X have an Expλ distribution, and conditionally given X x let U have a uniform distribution on [, x] Calculate E[U] and VarU Recall that the uniform distribution on, x has mean x x and variance We have E[U] E[E[U X]] and, since U is uniformly distributed on, X we have E[U X] X/ Hence, E[U] E[X/] λ Similarly, VarU X is equal to the variance of a uniform distribution on, X, which from the sheet of distributions, or by a simple calculation is X Hence, [ ] X X VarU E[VarU X] + VarE[U X] E + Var λ + 4 λ 5 λ 4 Let k R and let X, Y have joint probability density function kx sinxy for x,, y, π, f X,Y x, y otherwise a Find the value of k b For x,, find the conditional probability density function of Y given X x c Find E[Y X] 3

31 a We have P[X, Y R ], so Hence, k k k k k π x sinxy dy dx [ cosxy] π y dx cosπx dx [ x π sinπx ] x k b First, we need the marginal density of X For y, π we have Hence, f X x f X,Y x, y dy f X x Therefore, for x,, we have π x sinxy dy π [ cosxy]π y cosπx cosπx for x,, otherwise f Y Xx y f X,Y x, y f X x x sinxy cosπx for y, π, otherwise c Therefore, for x, we have E[Y X x] yf Y Xx y dy x π y sinxy dy cosπx To calculate the above, we use integration by parts to show that π [ y sinxy dy y ] π π x cosxy + cosxy dy x y π x cosπx + [ x sinxy ] π y π x cosπx + x sinπx Hence, E[Y X x] π cosπx x cosπx sinπx which means that E[Y X] cosπx π cosπx + X sinπx 4 Let U have a uniform distribution on,, and conditionally given U u let X have a uniform distribution on, u a Find the joint pdf of X, U and the marginal pdf of X 3

32 b Show that E[U X x] x log x a We have f X U x u /u for x, u, otherwise, f U u for u,, otherwise The joint pdf f X,U x, u satisfies f X,U x u f X U x,u f U u when f U u >, hence /u for u,, x, u, f X,U x, u otherwise /u for < x < u <, otherwise Hence, the marginal pdf of X is f X x x /u du for x,, otherwise, log x for x,, otherwise b To calculate E[U X ], we need the conditional pdf of U given X x, which is f U X u x f X,Ux, u f X x u log x for u x, otherwise, defined for x, Hence, E [U X x] uf U X u x x u u logx du x log x 4 Let X, Y have a bivariate distribution with joint pdf f X,Y x, y Let y R be such that f Y y > Show that f X Y y x is a probability density function We must check that f X Y y is non-negative and integrates to Since f X,Y x, y and f Y y >, we have that f X Y y x f X,Y x,y f Y y f X,Y x, y f X Y y x dx dx f Y y f X,Y x, y dx f Y y f Y y f Y y, Further, as required Here we use the definition of the marginal distribution f Y y f X,Y x, y dx 3

33 Challenge Questions 43 Give an example of random variables X, Y, Z such that P[X < Y ] P[Y < Z] P[Z < X] 3 Let X be uniformly distributed on the three element set,, } Set Y X + mod 3, and Z Y + mod 3 Of course, then X Z + mod 3 The real puzzle, of course, is how to dream up this example or another that does the same job This question is related to the observation that, in an election between three candidates A, B, C, it is possible for more than half of the voters to prefer A to B, for more than half to prefer B to C, and for more than half to prefer C to A 33

34 5 Transformations of Multivariate Distributions Warm-up Questions 5 a Define u x and v y Sketch the images of the regions S, T and U from question 4 in the u, v plane b Define u x + y and v x y Sketch the images of the regions S, T and U from question 4 in the u, v plane a b Ordinary Questions 5 The random variables X and Y have joint pdf given by xe y if x,, y, f X,Y x, y otherwise Define u ux, y x + y and v vx, y y Let U ux, Y and V vx, Y a Find the inverse transformation x xu, v and y yu, v and calculate the value of J det x/ u x/ v y/ u y/ v b Sketch the set of x, y for which f X,Y x, y is non-zero Find the image of this set in the u, v plane 34

35 c Deduce from a and b that the joint pdf of U, V is u v f U,V u, v e v/ for v >, u v, v + 4 otherwise a The inverse transformation is y v and x u y u v Hence, J det b f X,Y x, y is non-zero when x, and y, Call this set R The set R is bounded by the three lines x, x and y In the u, v plane these lines map respectively to v u, v u 4 and v The point x, y, R maps to u, v,, so R maps to the shaded region S in the u, v plane: We can describe S as the set of u, v such that v > and v u 4, u c Using a and b, we have f U,V u, v f X,Y u v, v J for u, v S otherwise u v e v/ for v >, v u 4, u otherwise 53 The random variables X and Y have joint pdf given by f X,Y x, y x + ye x+y for x, y elsewhere Let U X + Y and W X a Find the joint pdf of U, W and the marginal pdf of U b Recognize U as a standard distribution and, using the result of Question 5a, evaluate E[X + Y 5 ] 35

36 a Define U X + Y and W X so that Y U X U W The set H x, y : x, y } is bounded by the lines x and y, which respectively map to w and u w The point, is mapped to,, so H is mapped to H u, w : w u}: We have x u, and so the Jacobian is We thus have x w, y u, J det f U,W u, w f X,Y w, u w J y w ue u, w u, otherwise Pitfall: We must remember to specify the region on which the pdf f U,W is non-zero; this is the region on which f X,Y is non-zero, mapped through the transformation x, y u, v The same applies to almost all questions in this section b Therefore, the marginal pdf of U is given by f U u f U,W u, w dw u ue u dw ue u [w] u u e u Γ3 u3 e u for u, and otherwise, which means that U Ga3, From Question 5, we have E[X + Y k ] E[U k k ] k 34 + k which, for k 5, gives E[X + Y 5 ] ! 5 54 Let X and Y be a pair of independent random variables, both with the standard normal distribution Show that the joint pdf of U, V where U X and V X + Y is given by 8π f U,V u, v e v/ u / v u / for u v otherwise We have X U and Y V U The density f X,Y x, y f X xf Y y π e x +y / 36

37 is positive for all x, y R, and the transformation maps R to the set u, v : u v} We have x u u /, and so the Jacobian is J det Hence, for v u we have x v, u / v u / y u y v u / v v u / v u / 4 u / v u / f U,V u, v f X,Y u, v u J 8π e v/ u / v u /, and f U,V u, v otherwise 55 Let X, Y be a random vector with joint pdf e x+y x > y > ; f X,Y x, y otherwise a If U X Y and V Y/, find the joint pdf of U, V b Show that U and V are independent, and recognize their marginal distributions as standard distributions a We have f X,Y e x+y x > y > otherwise, and if u, v x y, y/ then x, y u+v, v The Jacobian of this transformation is det The region T x, y : x > y > } is described by x > y and y >, which respectively become u > and v > So, 4e u+4v for u >, v > f U,V u, v otherwise b f U,V u, v factorises as guhv where e u u > gu otherwise and hv 4e 4v v > otherwise, so U and V are independent We recognise gu and hv as the probability density functions of Exp and Exp4 random variables respectively, so U Exp and V Exp4 37

38 Pitfall: We could calculate the marginal distributions of U and V by integrating out variables, but it is much more efficient to spot that we can factorize f U,V u, v into f U uf V v, where f U and f V are probability density functions We know, in advance, that it will be possible to factorize in this way because they question told us that U and V would be independent The same issue also applies to many of the following questions 56 Let X and Y be a pair of independent and identically distributed random variables Let U X + Y and V X Y a Show that CovU, V, and give an example with justification to show that U and V are not necessarily independent b Show that U and V are independent in the special case where X and Y are standard normals a We have CovU, V E[X + Y X Y ] E[X + Y ]E[X Y ] E[X ] E[Y ] E[X] E[Y ] VarX VarY The last line follows because X and Y have the same distribution Hence, U and V are uncorrelated For the example, suppose that the common distribution of X and Y is such that they are equal to with probability and equal to with corresponding probability Then, P[U ] 4, and if U then X Y, which means V So U and V are not independent; P[U, V ] 4 P[U ]P[V ] P[X Y ]P[X Y ] 6 In fact, most possible examples result in U and V not being independent, and this one is chosen for its simplicity b If X and Y are standard normals, then by independence we have f X,Y x, y π e x +y / The transformation U X + Y, V X Y maps R to R, and has inverse transformation X U + V, Y U V We have and hence the Jacobian is x u, x v, J det y u, y v 38

39 Hence, the pdf of U, V is given by f U,V u, v f X,Y u + v, u v J 4π exp 8 u + v + u v +v /4 4π e u e u /4 e v /4 4π 4π Hence, U and V are independent, and both U and V have the N, distribution 57 Let X and Y be independent random variables with distributions Gaα, β and Gaα, β respectively Show that the random variables U X and V X + Y are X+Y independent with distributions Beα, α and Gaα + α, β respectively Since X and Y are independent we have f X,Y x, y f X xf Y y βα x βα Γα xα e βx Γα yα e βy β α +α Γα Γα xα y α e βx+y 5 If v x + y and u x+y then, to find the inverse transformation we observe that uv x + y x x+y x, so x uv and hence y v uv The conditions x > and y > translate to v > and < u < The partial derivatives are x u v, x v u, y u v, y v u and the Jacobian is v u J det v u + uv v uv + uv v v u Hence, for v > and u,, the joint pdf of U and V is f U,V u, v f X,Y uv, v uvv βα +α For all other u, v, we have f U,V u, v Γα Γα uvα v uv α e βv v β α +α Γα Γα uα u α v α +α e βv At this point we can recall that βα, α Γα +α Γα Γα and spot that f U,V u, v factorises as Bα f U,V u, v,α uα u α β α +α Γα +α vα +α e βv for v >, u, otherwise It follows immediately that V Gaα + α, β and that U and V are independent 58 As part of Question 57, we showed that if X and Y are independent random variables with X Gaα, β and Y Gaα, β, then X + Y Gaα + α, β 39

40 a Use induction to show that for n, if X, X,, X n are independent random variables with X i Gaα i, β then X i Ga α i, β b Hence show that for n, if Z, Z,, Z n are independent standard normal random variables then Zi χ n You may use the result of Example, which showed that this was true in the case n and, recall that the χ distribution is a special case of the Gamma distribution a The base case n is the given result from Question 57 If the claim is true for n k, then let X k X k i and Y X k+, so that X Ga α i, β and Y Gaα k+, β Then the same result from Question 57 shows that k+ k+ X i X + Y Ga α i, β, so the claim is true for n k + Hence it is true for all n by induction b Recall that the χ n distribution is the same as the Ga n, distribution In particular each Zi Ga,, so the result of a tells us that n Z i Ga n, χ n as required 59 Let X and Y be a pair of independent random variables, both with the standard normal distribution Show that U X/Y has the Cauchy distribution, with pdf for all u R f U u π + u The joint density f X,Y x, y f X xf Y y π e x +y / is positive for all x, y R We define U X/Y and V X which makes sense because P[Y ] We set u x/y and v x, and note that this transformation maps R to R The inverse transformation is x v and y v/u, which gives and hence the Jacobian is x u, x v, y u v u, J det v u 4 v u u y v u,

41 Hence, f U,V u, v f X,Y v, v/u J π exp v + /u v u for all u, v R To obtain f U u we integrate out v, giving f U u π π π π π u v u exp v u exp + /u + u + u v + /u v + /u [ exp dv dv v + /u exp v + /u ] This matches the pdf of the Cauchy distribution v + /u v dv 5 Let n N The t distribution often known as Student s t distribution is the univariate random variable X with pdf f X x Γ n+ nπγ n n+ + x, n for all x R Here, n is a parameter, known as the number of degrees of freedom Let Z be a standard normal random variable and let W be a chi-squared random variable with n degrees of freedom, where Z and W are independent Show that X Z W/n has the t distribution with n degrees of freedom To find the probability density function of X, consider Z, W as a bivariate random vector, and transform it to X, Y where X Z as above and Y W W/n By independence and the formulae for the density functions of the Normal and chi-squared distributions, f Z,W z, w n Γn/ π w n exp z + w, for w > We set x z/ w/n and y w, and note that the inverse of this transformation is given by w y and z x y/n Moreover, this transformation maps z, w : z R, w > } to x, y : x R, y > } The Jacobian is det y/n xy/n / y/n, /n 4

42 so the joint pdf of X and Y is f X,Y x, y n Γn/ π y n exp when y >, and equal to zero when y To obtain the pdf of X, we integrate out y: By Lemma 3, giving which simplifies to as required f X x x y n Γn/ πn y n exp y f X,Y x, y dy n Γn/ πn y n exp y x n + f X x dy n + y y/n x n +, y n exp y x n + x n+/ n + Γ x n+/ n Γn/ πn n + Γ f X x Γ n+ nπγ n n+ + x, n n + dy n +,, Challenge Questions 5 The formula 5, from the typed lecture notes, holds whenever the transformation u ux, y, v vx, y is both one-to-one and onto, and for which the Jacobian matrix of derivatives exists Use this fact to provide an alternative proof of Lemma 3 ie of the univariate transformation formula Hint Take a random variable X and a function g that satisfies the conditions of Lemma 3 Let W be a continuous random variable that is independent of X Define y gx, z w, and consider the pair Y, Z gx, W Find f Y,Z y, z, using equation 5, and integrate out z to obtain f Y y 5 Let X Expλ and Y Expλ be independent Show that U minx, Y has distribution Expλ + λ, and that P[minX, Y X] λ λ +λ Extend this result, by induction, to handle the minimum of finitely many exponential random variables Let W maxx, Y Show that U and W U are independent Hint Calculate P [minx, Y z] directly by partitioning on the event X < Y 4

43 6 Covariance Matrices and and Multivariate Normal Distributions Warm-up Questions 6 Let A and Σ 4 9 a Calculate deta and find both AΣ and AΣA T b Let A be the vector, Show that AΣA T a We have deta 3, and 4 5 AΣ AΣA T Alternatively, we could calculate ΣA T first and then pre-multiply by A b We have AΣA T Ordinary Questions 6 Let X X, Y T be a random vector with E[X], CovX Let U X + Y and V X Y + Write U U, V T a Write down a square matrix A and a vector b R such that U AX + b b Show that the mean vector and covariance matrix of U are given by 5 E[U], CovU 4 c Show that the correlation coefficient ρ of U and V is equal to 5 a We have U V so we take A and b 43 X Y +

44 b Using Lemma 63 we have and E[U] AE[X] + b + CovU A CovXA T c From the covariance matrix of U, ρu, V + CovU, V VarU VarV Let X and Y be independent standard normal random variables a Write down the covariance matrix of the random vector X X, Y T b Let R be the rotation matrix cos θ sin θ sin θ cos θ Show that RX has the same covariance matrix as X a Since X and Y are standard normals they both have variance, and by independence their covariance is zero Hence CovX This is the identify matrix, which we denote by I b We have CovRX R CovXR T RIR T RR T, and evaluating RR T gives cos θ sin θ cos θ sin θ cos θ + sin θ cos θ sin θ sin θ cos θ sin θ cos θ sin θ cos θ sin θ cos θ cos θ sin θ sin θ + cos θ which is the identity matrix I Since CovX I, we are done 64 Three univariate random variables X, Y and Z have means 3, 4 and 6 respectively and variances, and 5 respectively Further, X and Y are uncorrelated; the correlation coefficient between X and Z is 5 and that between Y and Z is 5 Let U X + Y Z and W X + Z 4 and set U U, W T a Find the mean vector and covariance matrix of X X, Y, Z T b Write down a matrix A and a vector b such that U AX + b c Find the mean vector and covariance matrix of U 44

45 d Evaluate E[X + Z 6 ] a We have E[X, Y, Z T ] E[X], E[Y ], E[Z] T 3, 4, 6 T For the covariance matrix, we have CovX, Y since X and Y are uncorrelated We are given that ρ X,Z 5, so CovX,Z 5 5 and hence CovX, Z Similarly ρ Y,Z 5 gives CovX, Z So, X Cov Y Z 5 b We define A and b by c It holds that U W U E AE[X] + b W X Y Z + 3 AX + b and U Cov A CovXA T W d We observe E[X + Z 6 ] E[W ] From b we have VarW VarW 33 and E[W ] 8, so E[W ] 6 Hence, by rearranging VarW E[W ] E[W ], E[W ] VarW + E[W ] Suppose that the random vector X X, X T follows the bivariate normal distribution with E[X ] E[X ], VarX, CovX, X and VarX 5 a Calculate the correlation coefficient of X and X Are X and X independent? b Find the mean and the covariance matrices of Y X and Z What are the distributions of Y and Z? a We have ρ CovX, X varx varx σ σ σ X X and X are positively correlated and they are not independent 45

46 b We have and CovY, CovX E[Y ], E[X],, 5 5, E[Z] E[X] T CovZ CovX T 9 In the cases of both Y and Z, the transformations which obtain them from X are linear Hence, both Y and Z are normally distributed Because the normal distribution is determined completely from its mean and its covariance matrix, it follows that the distributions of Y and Z are [ ] 9 63 Y N, 9 and Z N, Let X and X be bivariate normally distributed random variables each with mean and variance, and with correlation coefficient ρ a By integrating out the variable x in the joint pdf, verify that the marginal distribution of X is indeed that of a standard univariate normal random variable Hint: Use the fact that the integral of a Nµ, σ pdf is equal to b Show, using the usual formula for the conditional pdf that the conditional pdf of X given X x is Nρx, ρ a We have f X,X x, x π ρ exp ρ so integrating out x gives f X x π ρ π ρ [ } x ρx x + x ] [ x ρx x + x ] }, exp ρ dx [ } x ρx ρx + x ] exp ρ dx by completing the square x π ρ exp ρ } ρ exp x ρx } ρ dx by taking terms which don t involve x outside the integral } exp x π π ρ exp x ρx } ρ dx 46

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

MAS223 Statistical Modelling and Inference Examples

MAS223 Statistical Modelling and Inference Examples Chapter MAS3 Statistical Modelling and Inference Examples Example : Sample spaces and random variables. Let S be the sample space for the experiment of tossing two coins; i.e. Define the random variables

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:

Joint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix: Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu
