University of Karlsruhe, Institute of Stochastics
Prof. Dr. P. R. Parthasarathy
Dipl.-Math. D. Gentner

Applied Stochastic Models (SS 09)
Problem Set

Problem 0. Let X₁, X₂ ∼ U(θ − 1/2, θ + 1/2) be independent. Show that the probability density function (pdf) of X₁ − X₂ is independent of θ.

Problem 1. Let X and Y be random variables with distribution functions F and G respectively. We say that X is stochastically dominated (or dominated in distribution) by Y if F(x) ≥ G(x) for all x ∈ ℝ. Denote this relation by X ≤_d Y. Further, if F is a distribution function, its generalized inverse F⁻¹ : (0, 1) → ℝ is defined by

    F⁻¹(u) := inf{x ∈ ℝ : F(x) ≥ u},   u ∈ (0, 1).

(a) Prove that F⁻¹(u) ≤ x if and only if u ≤ F(x).
(b) Prove that if U ∼ U[0, 1], then X =_d F⁻¹(U).
(c) Prove that X ≤_d Y if and only if there exist two random variables X̂ and Ŷ with X̂ =_d X and Ŷ =_d Y (i.e. two copies of X and Y) such that X̂ ≤ Ŷ.

Problem 2. Suppose X is distributed according to a Poisson distribution with random parameter Λ, which is Γ(a, c)-distributed (a > 0, c > 0). Assume in addition c ∈ ℕ. Find the distribution of X.

Problem 3. Let X, Y ∼ U(0, 1) be independent. Write U := min{X, Y} and V := max{X, Y}. Find E[U], E[V] and calculate Cov(U, V).

Problem 4. Let n ∈ ℕ and Y ∼ Bin(n, X), where X is a random variable following a beta distribution on (0, 1) (with parameters p, q > 0). Find the distribution of Y. What happens if X is uniform on (0, 1)?
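The first problem's claim — that the law of X₁ − X₂ does not depend on θ — lends itself to a quick empirical check before turning to the proofs. The following Python sketch (the helper name `diff_sample` is ad hoc, not part of the problem set) draws the difference for several values of θ; in every case the sample should show mean ≈ 0 and variance ≈ 2 · (1/12) = 1/6, with all values in (−1, 1).

```python
import random

def diff_sample(theta, n=100_000, seed=0):
    """Draw n independent values of X1 - X2, where
    X1, X2 ~ U(theta - 1/2, theta + 1/2) are independent."""
    rng = random.Random(seed)
    return [(theta - 0.5 + rng.random()) - (theta - 0.5 + rng.random())
            for _ in range(n)]

# The law of X1 - X2 should be the same for every theta:
# mean 0, variance 1/6, support (-1, 1).
for seed, theta in enumerate((0.0, 5.0, -3.0), start=1):
    s = diff_sample(theta, seed=seed)
    mean = sum(s) / len(s)
    var = sum(x * x for x in s) / len(s) - mean ** 2
    print(f"theta={theta:+.1f}: mean~{mean:+.3f}, var~{var:.3f}")
```

Shifting by θ cancels in the difference, which is exactly the substitution Y_i := X_i − θ used in the solution below.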
Solutions

Problem 0. Let X₁, X₂ ∼ U(θ − 1/2, θ + 1/2) be independent. Show that the probability density function (pdf) of X₁ − X₂ is independent of θ.

Solution. If we set Y₁ := X₁ − θ and Y₂ := X₂ − θ, then Y₁, Y₂ ∼ U(−1/2, 1/2), which is independent of θ. Hence the distribution (in particular the pdf) of X₁ − X₂ = Y₁ − Y₂ is independent of θ.

Problem 1. Let X and Y be random variables with distribution functions F and G respectively. We say that X is stochastically dominated (or dominated in distribution) by Y if F(x) ≥ G(x) for all x ∈ ℝ. Denote this relation by X ≤_d Y. Further, if F is a distribution function, its generalized inverse F⁻¹ : (0, 1) → ℝ is defined by

    F⁻¹(u) := inf{x ∈ ℝ : F(x) ≥ u},   u ∈ (0, 1).

(a) Prove that F⁻¹(u) ≤ x if and only if u ≤ F(x).
(b) Prove that if U ∼ U[0, 1], then X =_d F⁻¹(U).
(c) Prove that X ≤_d Y if and only if there exist two random variables X̂ and Ŷ with X̂ =_d X and Ŷ =_d Y (i.e. two copies of X and Y) such that X̂ ≤ Ŷ.

Solution.
(a) If F⁻¹(u) ≤ x, then by monotonicity of F we have F(F⁻¹(u)) ≤ F(x). The definition of F⁻¹ and the right-continuity of F imply on the other hand that F(F⁻¹(u)) ≥ u, which together yields u ≤ F(x). Conversely, if F(x) ≥ u, then x ∈ {x′ ∈ ℝ : F(x′) ≥ u}, hence

    x ≥ inf{x′ ∈ ℝ : F(x′) ≥ u} = F⁻¹(u).

(b) Using (a), we have for every x ∈ ℝ

    P(F⁻¹(U) ≤ x) = P(U ≤ F(x)) = F(x) = P(X ≤ x),

which is equivalent to F⁻¹(U) =_d X.

(c) First assume the existence of X̂, Ŷ such that X̂ =_d X, Ŷ =_d Y and X̂ ≤ Ŷ. Then {Ŷ ≤ x} ⊆ {X̂ ≤ x} for every x ∈ ℝ, hence P(Ŷ ≤ x) ≤ P(X̂ ≤ x), thus G(x) ≤ F(x) for all x ∈ ℝ, meaning X ≤_d Y.
Conversely, if X ≤_d Y, then G(x) ≤ F(x) for all x ∈ ℝ, which implies for u ∈ (0, 1) that

    {x ∈ ℝ : G(x) ≥ u} ⊆ {x ∈ ℝ : F(x) ≥ u}.

Taking the infimum on both sides, this gives

    (1)   F⁻¹(u) ≤ G⁻¹(u),   u ∈ (0, 1).

Now take a U(0, 1)-distributed random variable U and set X̂ := F⁻¹(U), Ŷ := G⁻¹(U). Then according to (b) these are copies of X and Y, and (1) implies X̂ ≤ Ŷ.

Problem 2. Suppose X is distributed according to a Poisson distribution with random parameter Λ, which is Γ(a, c)-distributed (a > 0, c > 0). Assume in addition c ∈ ℕ. Find the distribution of X.

Solution. The density of Λ is given by

    f_Λ(x) = (a^c / Γ(c)) x^{c−1} e^{−ax} 1_{(0,∞)}(x),   x ∈ ℝ.

Since X takes values in ℕ₀, it is enough to compute P(X = n) for n ∈ ℕ₀. We find

    P(X = n) = E[P(X = n | Λ)]
             = ∫₀^∞ f_Λ(λ) P(X = n | Λ = λ) dλ
             = ∫₀^∞ f_Λ(λ) e^{−λ} λ^n / n! dλ
             = (a^c / (Γ(c) n!)) ∫₀^∞ λ^{c+n−1} e^{−(a+1)λ} dλ
             = (a^c / (Γ(c) n! (a+1)^{c+n})) ∫₀^∞ x^{c+n−1} e^{−x} dx
             = a^c Γ(c+n) / (Γ(c) n! (a+1)^{c+n})
             = C(c+n−1, n) (a/(a+1))^c (1/(a+1))^n,

where in the last step we used c ∈ ℕ, so that Γ(c+n)/(Γ(c) n!) = (c+n−1)⋯(c+1)c / n! = C(c+n−1, n). Hence X has a negative binomial distribution with parameters p = a/(a+1) and r = c. Using the formula

    (1 − x)^{−(m+1)} = Σ_{k=0}^∞ C(m+k, k) x^k,   |x| < 1,

with m = c − 1, we can check that this is indeed a distribution:

    Σ_{n=0}^∞ P(X = n) = (a/(a+1))^c Σ_{n=0}^∞ C(c+n−1, n) (1/(a+1))^n
                       = (a/(a+1))^c (1 − 1/(a+1))^{−c}
                       = (a/(a+1))^c (a/(a+1))^{−c} = 1.
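The mixture identity just derived — Poisson with Γ(a, c)-distributed parameter equals negative binomial — can also be verified by a small Monte Carlo experiment. The sketch below (helper names `neg_binomial_pmf` and `sample_mixed_poisson` are illustrative choices, not from the problem set) draws Λ from the Gamma distribution, then X from Poisson(Λ) via Knuth's multiplication-of-uniforms method, and compares empirical frequencies with the closed-form pmf.

```python
import math
import random

def neg_binomial_pmf(n, a, c):
    # P(X = n) = C(c + n - 1, n) * (a / (a + 1))**c * (1 / (a + 1))**n
    return math.comb(c + n - 1, n) * (a / (a + 1)) ** c / (a + 1) ** n

def sample_mixed_poisson(a, c, rng):
    # Draw Lambda ~ Gamma(shape c, rate a), then X ~ Poisson(Lambda).
    lam = rng.gammavariate(c, 1.0 / a)  # gammavariate takes (shape, scale)
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}.
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

rng = random.Random(42)
a, c, trials = 1.0, 2, 200_000
samples = [sample_mixed_poisson(a, c, rng) for _ in range(trials)]
for n in range(3):
    freq = samples.count(n) / trials
    # For a = 1, c = 2 the exact values are 1/4, 1/4, 3/16.
    print(n, round(freq, 3), round(neg_binomial_pmf(n, a, c), 3))
```

The empirical frequencies should agree with the formula to within Monte Carlo noise (about ±0.003 at this sample size).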
Problem 3. Let X, Y ∼ U(0, 1) be independent. Write U := min{X, Y} and V := max{X, Y}. Find E[U], E[V] and calculate Cov(U, V).

Solution. We have

    U = (X + Y)/2 − |X − Y|/2,   V = (X + Y)/2 + |X − Y|/2.

One calculates the pdf of X − Y as follows:

    ∫ 1_{(0,1)}(x + y) 1_{(0,1)}(y) dy = 1 + x for −1 < x < 0,  and  1 − x for 0 < x < 1.

If f is the pdf of some random variable Z, then (f(x) + f(−x)) 1_{(0,∞)}(x) is the pdf of |Z|. Hence |X − Y| has pdf (2 − 2x) 1_{(0,1)}(x). We then find

    E|X − Y| = ∫₀¹ x(2 − 2x) dx = [x² − (2/3)x³]₀¹ = 1 − 2/3 = 1/3.

Hence

    E[U] = E[(X + Y)/2] − E|X − Y|/2 = 1/2 − 1/6 = 1/3,

which implies E[V] = E[(X + Y)/2] + E|X − Y|/2 = 2/3. For the covariance, note that UV = min{X, Y} · max{X, Y} = XY, so we have

    Cov(U, V) = E[(U − E[U])(V − E[V])] = E[UV] − E[U]E[V]
              = E[XY] − E[U]E[V] = E[X]E[Y] − E[U]E[V]
              = 1/4 − 2/9 = 1/36.

Problem 4. Let n ∈ ℕ and Y ∼ Bin(n, X), where X is a random variable following a beta distribution on (0, 1) (with parameters p, q > 0). Find the distribution of Y. What happens if X is uniform on (0, 1)?

Solution. X has pdf

    f_X(x) = (1/B(p, q)) x^{p−1} (1 − x)^{q−1} 1_{(0,1)}(x),   where B(p, q) = ∫₀¹ x^{p−1} (1 − x)^{q−1} dx.

Hence for k ∈ {0, 1, …, n} we have

    P(Y = k) = E[P(Y = k | X)] = E[C(n, k) X^k (1 − X)^{n−k}]
             = C(n, k) (1/B(p, q)) ∫₀¹ x^k (1 − x)^{n−k} x^{p−1} (1 − x)^{q−1} dx
             = C(n, k) B(k + p, n − k + q) / B(p, q).
Since

    B(p, q) = Γ(p)Γ(q) / Γ(p + q),

we get

    P(Y = k) = C(n, k) (Γ(p + q) / (Γ(p)Γ(q))) · Γ(k + p)Γ(n − k + q) / Γ(n + p + q).

The formula Γ(x + 1) = xΓ(x) yields, after some cancellations,

    P(Y = k) = C(n, k) · [(k − 1 + p)⋯(1 + p)p] [(n − k − 1 + q)⋯(1 + q)q] / [(n − 1 + p + q)⋯(1 + p + q)(p + q)].

Finally, the case that X is uniform on (0, 1) is the special choice p = q = 1. In this case we have

    P(Y = k) = C(n, k) · k!(n − k)! / (n + 1)! = 1/(n + 1).

This means Y is uniform on {0, 1, …, n}.
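The beta-binomial pmf derived above is easy to check numerically: it must sum to 1 for any p, q > 0, and for p = q = 1 it must collapse to the uniform value 1/(n + 1). A minimal sketch (the helper name `beta_binomial_pmf` is an illustrative choice):

```python
import math

def beta_binomial_pmf(k, n, p, q):
    # P(Y = k) = C(n, k) * B(k + p, n - k + q) / B(p, q),
    # where B(x, y) = Gamma(x) * Gamma(y) / Gamma(x + y).
    beta = lambda x, y: math.gamma(x) * math.gamma(y) / math.gamma(x + y)
    return math.comb(n, k) * beta(k + p, n - k + q) / beta(p, q)

n = 10
# The pmf sums to 1 for arbitrary parameters p, q > 0:
total = sum(beta_binomial_pmf(k, n, 2.5, 0.7) for k in range(n + 1))
assert abs(total - 1) < 1e-9
# For p = q = 1 (X uniform on (0, 1)), Y is uniform on {0, ..., n}:
for k in range(n + 1):
    assert abs(beta_binomial_pmf(k, n, 1, 1) - 1 / (n + 1)) < 1e-12
```

Both checks are deterministic consequences of the Γ-function identities used in the solution, so no sampling is needed here.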