Applied Stochastic Models (SS 09)


University of Karlsruhe
Institute of Stochastics
Prof. Dr. P. R. Parthasarathy
Dipl.-Math. D. Gentner

Applied Stochastic Models (SS 09)

Problem Set

Problem 0. Let $X_1, X_2 \sim U(\theta - 1/2,\, \theta + 1/2)$ be independent. Show that the probability density function (pdf) of $X_1 - X_2$ does not depend on $\theta$.

Problem 1. Let $X$ and $Y$ be random variables with distribution functions $F$ and $G$ respectively. We say that $X$ is stochastically dominated (or dominated in distribution) by $Y$ if $F(x) \geq G(x)$ for all $x \in \mathbb{R}$. Denote this relation by $X \leq_d Y$. Further, if $F$ is a distribution function, its generalized inverse $F^- \colon (0,1) \to \mathbb{R}$ is defined by
$$F^-(u) := \inf\{x \in \mathbb{R} : F(x) \geq u\}, \quad u \in (0,1).$$
(a) Prove that $F^-(u) \leq x$ if and only if $u \leq F(x)$.
(b) Prove that if $U \sim U[0,1]$, then $X =_d F^-(U)$.
(c) Prove that $X \leq_d Y$ if and only if there exist two random variables $\hat{X}$ and $\hat{Y}$ with $\hat{X} =_d X$ and $\hat{Y} =_d Y$ (i.e. two copies of $X$ and $Y$) such that $\hat{X} \leq \hat{Y}$.

Problem 2. Suppose $X$ is distributed according to a Poisson distribution with random parameter $\Lambda$, which is $\Gamma(a,c)$-distributed ($a > 0$, $c > 0$). Assume in addition $c \in \mathbb{N}$. Find the distribution of $X$.

Problem 3. Let $X, Y \sim U(0,1)$ be independent. Write $U := \min\{X,Y\}$ and $V := \max\{X,Y\}$. Find $E[U]$ and $E[V]$, and calculate $\mathrm{Cov}(U,V)$.

Problem 4. Let $n \in \mathbb{N}$ and $Y \sim \mathrm{Bin}(n,X)$, where $X$ is a random variable following a beta distribution on $(0,1)$ (with parameters $p, q > 0$). Find the distribution of $Y$. What happens if $X$ is uniform on $(0,1)$?

Solutions:

Problem 0. Let $X_1, X_2 \sim U(\theta - 1/2,\, \theta + 1/2)$ be independent. Show that the pdf of $X_1 - X_2$ does not depend on $\theta$.

If we set $Y_1 := X_1 - \theta$ and $Y_2 := X_2 - \theta$, then $Y_1, Y_2 \sim U(-1/2, 1/2)$ are independent, and their joint distribution does not involve $\theta$. Hence the distribution (in particular the pdf) of $X_1 - X_2 = Y_1 - Y_2$ is independent of $\theta$.
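As a numerical sanity check one can simulate the difference for several values of $\theta$: by the same convolution computation as in Problem 3 below, $X_1 - X_2$ has the triangular pdf $1 - |x|$ on $(-1,1)$ for every $\theta$. The following is a minimal Python/numpy sketch; the sample size and the values of $\theta$ are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

def sample_diff(theta):
    # X1, X2 iid U(theta - 1/2, theta + 1/2); return X1 - X2
    x1 = rng.uniform(theta - 0.5, theta + 0.5, n)
    x2 = rng.uniform(theta - 0.5, theta + 0.5, n)
    return x1 - x2

# The histogram of X1 - X2 should match the triangular density
# 1 - |x| on (-1, 1), regardless of theta.
bins = np.linspace(-1.0, 1.0, 21)
centers = (bins[:-1] + bins[1:]) / 2
for theta in (0.0, 5.0, -3.2):
    hist, _ = np.histogram(sample_diff(theta), bins=bins, density=True)
    err = np.max(np.abs(hist - (1.0 - np.abs(centers))))
    print(f"theta = {theta:5.1f}: max deviation from 1 - |x| = {err:.4f}")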

Problem 1. Let $X$ and $Y$ be random variables with distribution functions $F$ and $G$ respectively. We say that $X$ is stochastically dominated by $Y$, written $X \leq_d Y$, if $F(x) \geq G(x)$ for all $x \in \mathbb{R}$; the generalized inverse of $F$ is $F^-(u) := \inf\{x \in \mathbb{R} : F(x) \geq u\}$, $u \in (0,1)$. (a) Prove that $F^-(u) \leq x$ if and only if $u \leq F(x)$. (b) Prove that if $U \sim U[0,1]$, then $X =_d F^-(U)$. (c) Prove that $X \leq_d Y$ if and only if there exist two random variables $\hat{X}$ and $\hat{Y}$ with $\hat{X} =_d X$ and $\hat{Y} =_d Y$ (i.e. two copies of $X$ and $Y$) such that $\hat{X} \leq \hat{Y}$.

(a) If $F^-(u) \leq x$, then by monotonicity of $F$ we have $F(F^-(u)) \leq F(x)$. The definition of $F^-$ and the right-continuity of $F$ imply on the other hand that $F(F^-(u)) \geq u$, which together yields $u \leq F(x)$. Conversely, if $F(x) \geq u$, then $x \in \{x' \in \mathbb{R} : F(x') \geq u\}$, hence
$$x \geq \inf\{x' \in \mathbb{R} : F(x') \geq u\} = F^-(u).$$

(b) We have
$$P(F^-(U) \leq x) \overset{\text{(a)}}{=} P(U \leq F(x)) = F(x) = P(X \leq x), \quad x \in \mathbb{R},$$
which is equivalent to $F^-(U) =_d X$.

(c) First assume the existence of $\hat{X}, \hat{Y}$ such that $\hat{X} =_d X$, $\hat{Y} =_d Y$ and $\hat{X} \leq \hat{Y}$. Then $\{\hat{Y} \leq x\} \subseteq \{\hat{X} \leq x\}$ for all $x \in \mathbb{R}$, hence $P(\hat{Y} \leq x) \leq P(\hat{X} \leq x)$, thus $G(x) \leq F(x)$ for all $x \in \mathbb{R}$, meaning $X \leq_d Y$.

Conversely, if $X \leq_d Y$, then $G(x) \leq F(x)$ for all $x \in \mathbb{R}$, which implies for $u \in (0,1)$ that $\{x \in \mathbb{R} : G(x) \geq u\} \subseteq \{x \in \mathbb{R} : F(x) \geq u\}$. Taking the infimum on both sides, this gives
$$(1) \qquad F^-(u) \leq G^-(u), \quad u \in (0,1).$$
Now take a $U(0,1)$-distributed random variable $U$ and set $\hat{X} := F^-(U)$, $\hat{Y} := G^-(U)$. According to (b) these are copies of $X$ and $Y$, and (1) implies $\hat{X} \leq \hat{Y}$.
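Part (b) is exactly the inverse transform method for random variate generation, and part (c) constructs the standard monotone coupling. A minimal Python/numpy sketch, using $X \sim \mathrm{Exp}(2)$ and $Y \sim \mathrm{Exp}(1)$ as an illustrative pair (this choice is not from the problem set; $X \leq_d Y$ holds since $1 - e^{-2x} \geq 1 - e^{-x}$ for all $x \geq 0$):

import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(size=1_000_000)

def exp_inv_cdf(u, rate):
    # For Exp(rate), F(x) = 1 - exp(-rate * x), so F^-(u) = -log(1 - u) / rate.
    return -np.log1p(-u) / rate

# Part (b): inverse transform sampling reproduces the target law.
y_hat = exp_inv_cdf(u, rate=1.0)            # copy of Y ~ Exp(1)
print("Exp(1) via F^-(U): mean ~ 1:", y_hat.mean())

# Part (c): feeding the SAME uniform U into both generalized inverses
# yields copies that are ordered pointwise, X_hat <= Y_hat.
x_hat = exp_inv_cdf(u, rate=2.0)            # copy of X ~ Exp(2)
print("X_hat <= Y_hat everywhere:", bool(np.all(x_hat <= y_hat)))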

Problem 2. Suppose $X$ is distributed according to a Poisson distribution with random parameter $\Lambda$, which is $\Gamma(a,c)$-distributed ($a > 0$, $c > 0$). Assume in addition $c \in \mathbb{N}$. Find the distribution of $X$.

The density of $\Lambda$ is given by
$$f_\Lambda(x) = \frac{a^c}{\Gamma(c)}\, x^{c-1} e^{-ax}\, \mathbb{1}_{(0,\infty)}(x), \quad x \in \mathbb{R}.$$
Since $X \in \mathbb{N}_0$, it is enough to compute $P(X = n)$ for $n \in \mathbb{N}_0$. We find
$$P(X = n) = E[P(X = n \mid \Lambda)] = \int_0^\infty f_\Lambda(\lambda)\, e^{-\lambda}\, \frac{\lambda^n}{n!}\, d\lambda = \frac{a^c}{\Gamma(c)\, n!} \int_0^\infty \lambda^{c+n-1}\, e^{-(a+1)\lambda}\, d\lambda = \frac{a^c}{\Gamma(c)\, n!}\, \frac{\Gamma(c+n)}{(a+1)^{c+n}},$$
where the last step substitutes $x = (a+1)\lambda$ and uses $\int_0^\infty x^{c+n-1} e^{-x}\, dx = \Gamma(c+n)$. Since $c \in \mathbb{N}$, we have $\frac{\Gamma(c+n)}{\Gamma(c)\, n!} = \frac{(c+n-1)\cdots(c+1)c}{n!} = \binom{c+n-1}{n}$, so
$$P(X = n) = \binom{c+n-1}{n} \left(\frac{a}{a+1}\right)^{c} \left(\frac{1}{a+1}\right)^{n}.$$
Hence $X$ has a negative binomial distribution with parameters $p = \frac{a}{a+1}$ and $r = c$. Using the formula
$$(1-x)^{-c} = \sum_{n=0}^{\infty} \binom{c+n-1}{n} x^n, \quad |x| < 1,$$
we can check that this is indeed a probability distribution:
$$\sum_{n=0}^{\infty} P(X = n) = \left(\frac{a}{a+1}\right)^{c} \left(1 - \frac{1}{a+1}\right)^{-c} = \left(\frac{a}{a+1}\right)^{c} \left(\frac{a+1}{a}\right)^{c} = 1.$$
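The Gamma-Poisson mixture result can also be checked by simulation. A short Python/numpy sketch (the values $a = 2$, $c = 3$ and the sample size are arbitrary illustrative choices; note that the $\Gamma(a,c)$ density above corresponds to numpy's gamma with shape $c$ and scale $1/a$):

import numpy as np
from math import comb

rng = np.random.default_rng(2)
a, c, m = 2.0, 3, 1_000_000

# Lambda ~ Gamma(a, c) with density a^c x^(c-1) e^(-a x) / Gamma(c),
# i.e. numpy's gamma with shape=c and scale=1/a; then X ~ Poisson(Lambda).
lam = rng.gamma(shape=c, scale=1.0 / a, size=m)
x = rng.poisson(lam)

p = a / (a + 1.0)
for k in range(6):
    pmf = comb(c + k - 1, k) * p**c * (1.0 - p)**k    # negative binomial pmf
    print(f"P(X={k}): empirical {np.mean(x == k):.4f}, formula {pmf:.4f}")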

Problem 3. Let $X, Y \sim U(0,1)$ be independent. Write $U := \min\{X,Y\}$ and $V := \max\{X,Y\}$. Find $E[U]$ and $E[V]$, and calculate $\mathrm{Cov}(U,V)$.

We have
$$U = \frac{1}{2}(X+Y) - \frac{1}{2}|X-Y|, \qquad V = \frac{1}{2}(X+Y) + \frac{1}{2}|X-Y|.$$
One calculates the pdf of $X - Y$ as follows:
$$\int \mathbb{1}_{(0,1)}(x+y)\, \mathbb{1}_{(0,1)}(y)\, dy = \begin{cases} 1+x, & -1 < x < 0, \\ 1-x, & 0 < x < 1. \end{cases}$$
If $f$ is the pdf of some random variable $Z$, then $(f(x) + f(-x))\, \mathbb{1}_{(0,\infty)}(x)$ is the pdf of $|Z|$. Hence $|X-Y|$ has pdf $(2-2x)\, \mathbb{1}_{(0,1)}(x)$. We then find
$$E|X-Y| = \int_0^1 x(2-2x)\, dx = \left[ x^2 - \frac{2}{3}x^3 \right]_0^1 = 1 - \frac{2}{3} = \frac{1}{3}.$$
Hence
$$E[U] = E\left[\tfrac{1}{2}(X+Y)\right] - \tfrac{1}{2}\, E|X-Y| = \frac{1}{2} - \frac{1}{6} = \frac{1}{3},$$
which implies $E[V] = E[X] + E[Y] - E[U] = \frac{2}{3}$. For the covariance, we have, since $UV = XY$,
$$\mathrm{Cov}(U,V) = E[UV] - E[U]E[V] = E[XY] - E[U]E[V] = E[X]E[Y] - E[U]E[V] = \frac{1}{4} - \frac{2}{9} = \frac{1}{36}.$$

Problem 4. Let $n \in \mathbb{N}$ and $Y \sim \mathrm{Bin}(n,X)$, where $X$ follows a beta distribution on $(0,1)$ with parameters $p, q > 0$. Find the distribution of $Y$. What happens if $X$ is uniform on $(0,1)$?

$X$ has pdf
$$f_X(x) = \frac{1}{B(p,q)}\, x^{p-1}(1-x)^{q-1}\, \mathbb{1}_{(0,1)}(x), \quad \text{where } B(p,q) = \int_0^1 x^{p-1}(1-x)^{q-1}\, dx.$$
Hence for $k \in \{0, 1, \ldots, n\}$ we have
$$P(Y = k) = E[P(Y = k \mid X)] = E\left[ \binom{n}{k} X^k (1-X)^{n-k} \right] = \binom{n}{k} \frac{1}{B(p,q)} \int_0^1 x^{k+p-1}(1-x)^{n-k+q-1}\, dx = \binom{n}{k} \frac{B(k+p,\, n-k+q)}{B(p,q)}.$$
Since $B(p,q) = \frac{\Gamma(p)\Gamma(q)}{\Gamma(p+q)}$, we get
$$P(Y = k) = \binom{n}{k} \frac{\Gamma(p+q)}{\Gamma(p)\Gamma(q)} \cdot \frac{\Gamma(k+p)\,\Gamma(n-k+q)}{\Gamma(n+p+q)}.$$
The formula $\Gamma(x+1) = x\Gamma(x)$ yields, after some cancellations,
$$P(Y = k) = \binom{n}{k} \frac{(k-1+p)\cdots p \,\cdot\, (n-k-1+q)\cdots q}{(n-1+p+q)\cdots(p+q)}.$$
Finally, the case that $X$ is uniform on $(0,1)$ is the special choice $p = q = 1$. In this case we have
$$P(Y = k) = \binom{n}{k} \frac{k!\,(n-k)!}{(n+1)!} = \frac{1}{n+1}.$$
This means $Y$ is uniform on $\{0, 1, \ldots, n\}$.
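Both results lend themselves to a quick Monte Carlo check. A minimal Python/numpy sketch (the sample size and the choice $n = 5$ are arbitrary):

import numpy as np

rng = np.random.default_rng(3)
m = 1_000_000

# Problem 3: E[U] = 1/3, E[V] = 2/3, Cov(U, V) = 1/36.
x = rng.uniform(size=m)
y = rng.uniform(size=m)
u, v = np.minimum(x, y), np.maximum(x, y)
print("E[U] ~", u.mean(), "(exact 1/3)")
print("E[V] ~", v.mean(), "(exact 2/3)")
print("Cov(U,V) ~", np.mean(u * v) - u.mean() * v.mean(), "(exact 1/36)")

# Problem 4 with p = q = 1: Y ~ Bin(n, X) with X ~ U(0,1) should be
# uniform on {0, 1, ..., n}.
n = 5
y_mix = rng.binomial(n, rng.uniform(size=m))
freqs = [round(float(np.mean(y_mix == k)), 4) for k in range(n + 1)]
print("P(Y=k) ~", freqs, "(exact 1/(n+1) =", 1 / (n + 1), ")")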