Stochastic models and their distributions


Counting customers. Suppose that $n$ customers arrive at a grocery at times, say $T_1, \dots, T_n$, each of which takes any real number in the interval $(0, t)$ equally likely. The values $T_1, \dots, T_n$ are called arrival times, and they are independent and identically distributed (iid) uniform random variables on $(0, t)$. By $N(s)$ we denote the number of customers arriving in the time interval $(0, s)$ for $0 \le s \le t$.

Probability density function. The probability density function (pdf) for each arrival time is given by
$$
f(x) = \begin{cases} 1/t & \text{if } 0 < x < t; \\ 0 & \text{otherwise [that is, if } x \le 0 \text{ or } x \ge t]. \end{cases}
$$
The customers independently arrive in the time interval $(0, s)$ with probability $p := s/t$. Then the frequency function for $N(s)$ is given by
$$
P\{N(s) = k\} = \binom{n}{k} \left(\frac{s}{t}\right)^k \left(1 - \frac{s}{t}\right)^{n-k}, \tag{1.1}
$$
and it is called a binomial distribution with success probability $p = s/t$ and the number $n$ of trials.

Order statistics and arrival time. Let $X_1, \dots, X_n$ be iid random variables with the common cumulative distribution function (cdf) $F(x)$ and the pdf $f(x)$. When we sort $X_1, \dots, X_n$ as $X_{(1)} < X_{(2)} < \dots < X_{(n)}$, the random variable $X_{(k)}$ is called the $k$-th order statistic. Then we have the following theorem.

Theorem. The pdf of the $k$-th order statistic $X_{(k)}$ is given by
$$
f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, f(x)\, F^{k-1}(x)\, [1 - F(x)]^{n-k}. \tag{1.2}
$$

Probability of small interval: the proof of the theorem. If $f$ is continuous at $x$ and $\delta > 0$ is small, then the probability that a random variable $X$ falls into the small interval $[x, x + \delta]$ is approximated by $\delta f(x)$. In differential notation, we can write $P(x \le X \le x + dx) = f(x)\,dx$. Now consider the $k$-th order statistic $X_{(k)}$ and the event that $x \le X_{(k)} \le x + dx$. This event occurs when $(k-1)$ of the $X_i$'s are less than $x$ and $(n-k)$ of the $X_i$'s are greater than $x + dx$. Since there are $\binom{n}{k-1,\,1,\,n-k} = \frac{n!}{(k-1)!\,1!\,(n-k)!}$ arrangements for the $X_i$'s, we can obtain the probability of this event as
$$
f_k(x)\,dx = \frac{n!}{(k-1)!\,(n-k)!}\, (F(x))^{k-1}\, (f(x)\,dx)\, [1 - F(x)]^{n-k},
$$
which implies (1.2).
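As a quick sanity check of (1.2), here is a short Python sketch, not part of the original notes, comparing an empirical histogram of the $k$-th smallest of $n$ iid Uniform$(0,1)$ samples with the formula (with $F(x) = x$ and $f(x) = 1$); the helper name order_stat_pdf is just illustrative.

    # Monte Carlo check of the order-statistic density (1.2) for iid Uniform(0,1)
    # samples, where F(x) = x and f(x) = 1 on (0, 1).  Illustrative sketch only.
    import numpy as np
    from math import comb

    def order_stat_pdf(x, k, n):
        # n!/((k-1)!(n-k)!) * x^(k-1) * (1-x)^(n-k), written via k*C(n, k)
        return k * comb(n, k) * x**(k - 1) * (1.0 - x)**(n - k)

    rng = np.random.default_rng(0)
    n, k, reps = 10, 3, 200_000
    samples = np.sort(rng.uniform(size=(reps, n)), axis=1)[:, k - 1]  # k-th smallest

    hist, edges = np.histogram(samples, bins=20, range=(0.0, 1.0), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    print(np.max(np.abs(hist - order_stat_pdf(centers, k, n))))  # small (Monte Carlo error)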

Beta distribution. In particular, when $X_1, \dots, X_n$ are iid uniform random variables on $[0, 1]$, the pdf of the $k$-th order statistic $X_{(k)}$ becomes
$$
f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\, x^{k-1} (1 - x)^{n-k}, \quad 0 \le x \le 1,
$$
which is called a beta distribution with parameters $\alpha = k$ and $\beta = n - k + 1$. Now consider the arrival times $T_1, \dots, T_n$ in our model and their order statistics $T_{(1)} < \dots < T_{(n)}$. Then it is not hard to see that $T_{(k)}/t$ has a beta distribution with $(k, n + 1 - k)$.

Poisson approximation. The Poisson distribution with parameter $\lambda$ ($\lambda > 0$) has the frequency function
$$
p(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \dots,
$$
and may be used as an approximation for a binomial distribution with parameter $(n, p)$ if $n$ is large and $p$ is small enough so that $np$ is a moderate number $\lambda$. Let $\lambda = np$. Then the binomial frequency function becomes
$$
p(k) = \frac{n!}{k!\,(n-k)!} \left(\frac{\lambda}{n}\right)^k \left(1 - \frac{\lambda}{n}\right)^{n-k}
= \frac{\lambda^k}{k!} \cdot \frac{n!}{(n-k)!\,n^k} \cdot \frac{(1 - \lambda/n)^n}{(1 - \lambda/n)^k}
\to e^{-\lambda} \frac{\lambda^k}{k!} \quad \text{as } n \to \infty.
$$
In the previous model $\lambda = n/t$ represents the average number of customers per hour. If $p = s/t$ is small enough, we can apply the Poisson approximation with $np = \frac{n}{t}\, s = \lambda s$ to get
$$
P\{N(s) = k\} \approx e^{-\lambda s} \frac{(\lambda s)^k}{k!}.
$$

Problem 1. Suppose that $X$ and $Y$ are independent random variables distributed as Poisson distributions with the respective parameters $\alpha$ and $\beta$. Let $N = X + Y$. Calculate the conditional probability $P(X = k \mid N = n)$.

Problem 2. Suppose that a random variable $N$ has a Poisson distribution with parameter $\lambda$, and that a random variable $X$ is conditionally distributed as a binomial distribution with parameter $(n, p)$ given $N = n$. Let $Y = N - X$. (a) Calculate $P(X = j, Y = k)$. (b) Show that both $X$ and $Y$ have Poisson distributions. (c) Show that $X$ and $Y$ are independent.
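The convergence above is easy to see numerically. The following sketch, not from the original notes, fixes $\lambda = np$ and compares the Binomial$(n, \lambda/n)$ and Poisson$(\lambda)$ frequency functions for a few values of $n$.

    # Numerical illustration of the Poisson approximation: for fixed lambda = n*p,
    # the Binomial(n, lambda/n) frequencies approach the Poisson(lambda) frequencies
    # as n grows.  Illustrative sketch only.
    from math import comb, exp, factorial

    lam = 2.0

    def binom_pmf(k, n, p):
        return comb(n, k) * p**k * (1.0 - p)**(n - k)

    def poisson_pmf(k, lam):
        return exp(-lam) * lam**k / factorial(k)

    for n in (10, 100, 1000):
        max_err = max(abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
                      for k in range(11))
        print(n, round(max_err, 6))  # the largest gap shrinks roughly like 1/n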

Poisson process. Maybe a fixed size $n$ of customers is not so realistic. So we assume that the total size $N$ of customers is distributed as a Poisson distribution with parameter $\lambda t$. Here $\lambda t$ represents the average total size of customers over the interval $(0, t)$. Let us compute $P\{N(s) = k\}$ exactly. Noticing that (1.1) becomes a conditional probability here, we can compute
$$
P\{N(s) = k\} = \sum_{n=k}^{\infty} P(N(s) = k \mid N = n)\, P\{N = n\}
= \sum_{n=k}^{\infty} \binom{n}{k} \left(\frac{s}{t}\right)^k \left(1 - \frac{s}{t}\right)^{n-k} e^{-\lambda t} \frac{(\lambda t)^n}{n!}
= e^{-\lambda t} \frac{(\lambda s)^k}{k!} \sum_{n=k}^{\infty} \frac{(\lambda(t - s))^{n-k}}{(n-k)!}
= e^{-\lambda s} \frac{(\lambda s)^k}{k!}.
$$
So $P\{N(s) = k\}$ has a Poisson distribution with parameter $\lambda s$. Observe that $P\{N(s) = k\}$ does not depend on $t$, and $N(s)$ can be seen as a process counting up customers arriving by the time $s$. Thus, $N(s)$ is called a Poisson process.

Problem 3. Let $N(s)$, $0 \le s \le t$, be a Poisson process as constructed above. (a) Show that $N(t) - N(s)$ is distributed as a Poisson distribution, and that $N(t) - N(s)$ is independent of $N(s)$. Then calculate $E[(N(t) - N(s))N(s)]$. (b) Using the formula $\mathrm{Var}(X) = E[X^2] - (E[X])^2$, calculate $E[N(s)^2]$. Then find $E[N(s)N(t)]$.

Problem 4. Provided $N(t) = n$, we can consider the arrival times $T_{(1)}, \dots, T_{(n)}$ observed over $(0, t)$. (a) Find the conditional probability $G_n(u) = P\!\left(\frac{T_{(k)}}{t} \le u \,\middle|\, N(t) = n\right)$ on $[0, 1]$. (b) Show that the cdf $G_n(u)$ has the beta density function with $(k, n + 1 - k)$. Hint: In the above construction of the Poisson process, we find that the conditional probability $P(N(s) = k \mid N(t) = n)$ has a binomial distribution with parameter $(n, s/t)$ for $0 < s < t$.

Problem 5. Suppose that customers arrive at a car dealership as a Poisson process with average number $\lambda$ of customers per hour, and that a customer makes no decision (i.e., not purchasing a car) with probability $\alpha > 0$, independently of other customers. Thus, none of the first $k$ customers purchases a car from the dealership with probability $\alpha^k$. What is the probability that no car is sold by time $t$?

Gamma distributed business hours. Now we want to change the assumption on the arrival times of customers. Instead of the customers being distributed uniformly over $(0, t)$, we assume that the waiting times between customers (the time $W_1$ until the first customer arrives, the time $W_2$ between the first customer and the second, and so on) are independently and exponentially distributed with parameter $\lambda$. The successive waiting times $W_1, W_2, \dots$ are now iid exponential random variables having the pdf
$$
f(x) = \begin{cases} \lambda e^{-\lambda x} & x \ge 0; \\ 0 & x < 0. \end{cases}
$$
Here the reciprocal $\lambda^{-1}$ of the average number of customers per hour becomes the average waiting time between arrivals. Thus, if the grocer likes to close his shop at the arrival of the $(n+1)$th customer, such a time $T_{n+1}$ is now a random variable.
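A small simulation sketch of this new arrival model, not part of the original notes: draw iid Exp$(\lambda)$ waiting times, form the arrival times as partial sums, and check that $N(s)$ behaves like a Poisson$(\lambda s)$ count and that the $k$-th arrival time has mean $k/\lambda$, the gamma mean derived in the next section.

    # Simulating the new arrival model: iid Exp(lambda) waiting times W_1, W_2, ...
    # and arrival times T_k = W_1 + ... + W_k.  Checks that N(s) looks Poisson(lambda*s)
    # and that T_k has the Gamma(k, lambda) mean k/lambda.  Illustrative sketch only.
    import numpy as np

    rng = np.random.default_rng(1)
    lam, s, k, reps = 1.5, 2.0, 3, 100_000

    waits = rng.exponential(scale=1.0 / lam, size=(reps, 50))  # W_1, ..., W_50 per run
    arrivals = np.cumsum(waits, axis=1)                        # T_1 < T_2 < ... < T_50

    N_s = (arrivals <= s).sum(axis=1)   # number of arrivals in (0, s]
    T_k = arrivals[:, k - 1]            # arrival time of the k-th customer

    print(N_s.mean(), N_s.var(), lam * s)   # Poisson: mean and variance both ~ 3.0
    print(T_k.mean(), k / lam)              # Gamma(k, lambda) mean, ~ 2.0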

Gamma distribution. The arrival time of the $k$th customer,
$$
T_{(k)} = W_1 + \dots + W_k, \tag{1.3}
$$
has a gamma distribution with parameter $(k, \lambda)$. The pdf for the gamma distribution is given by
$$
f(t) = \frac{\lambda^k}{\Gamma(k)}\, t^{k-1} e^{-\lambda t}, \quad t \ge 0,
$$
where
$$
\Gamma(x) := \int_0^{\infty} u^{x-1} e^{-u}\, du, \quad x > 0,
$$
is called the gamma function. When $x = k$ is a positive integer, it is simply $\Gamma(k) = (k-1)!$. We call the parameter $k$ a shape parameter, because the change of $k$ causes that of the shape of the density. We call the parameter $\lambda$ a scale parameter, because the change of $\lambda$ merely rescales the density without changing its shape. In particular, the gamma distribution with $k = 1$ becomes an exponential distribution.

Gamma distribution and Poisson process. The grocer has at most $k$ customers at the time $s$ if the $(k+1)$th customer has not yet arrived. This means, in terms of probability, that
$$
P\{N(s) \le k\} = P\{T_{(k+1)} > s\} \quad \text{for } k = 0, 1, \dots \tag{1.4}
$$
Here we claim by mathematical induction that
$$
P\{T_{(k+1)} > s\} = \sum_{i=0}^{k} e^{-\lambda s} \frac{(\lambda s)^i}{i!}. \tag{1.5}
$$
Clearly, $P\{N(s) = 0\} = e^{-\lambda s} = P\{T_{(1)} > s\}$. Now suppose that (1.5) holds for $k$ replaced by $k - 1$. We have
$$
P\{T_{(k+1)} > s\} = \int_s^{\infty} \frac{\lambda e^{-\lambda x} (\lambda x)^k}{k!}\, dx
= \left[ -\frac{e^{-\lambda x} (\lambda x)^k}{k!} \right]_s^{\infty} + \int_s^{\infty} \frac{\lambda e^{-\lambda x} (\lambda x)^{k-1}}{(k-1)!}\, dx
= e^{-\lambda s} \frac{(\lambda s)^k}{k!} + P\{T_{(k)} > s\}
= \sum_{i=0}^{k} e^{-\lambda s} \frac{(\lambda s)^i}{i!},
$$
which has verified (1.5). Therefore, combining (1.4) and (1.5), we find that $P\{N(s) = k\}$ has a Poisson distribution with parameter $\lambda s$.

Gamma and beta distribution. Notice that $T_{(k)}$ and $T_{(n+1)} - T_{(k)} = W_{k+1} + \dots + W_{n+1}$ are independent and each distributed as a gamma distribution with respective parameters $(k, \lambda)$ and $(n + 1 - k, \lambda)$. And it is known that $T_{(k)}/T_{(n+1)}$ has a beta distribution with $(k, n + 1 - k)$.
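The identity (1.4)-(1.5) linking the Poisson cdf to the gamma survival function can be verified numerically. The following sketch is not from the original notes and assumes scipy is available.

    # Numerical check of (1.4)-(1.5): the Poisson(lambda*s) cdf at k equals the
    # Gamma(k+1, lambda) survival function at s.  Sketch only; assumes scipy is
    # available (scipy.stats.gamma uses a shape parameter a and scale 1/lambda).
    from math import exp, factorial
    from scipy.stats import gamma, poisson

    lam, s = 1.3, 2.5
    for k in range(6):
        lhs = poisson.cdf(k, lam * s)                    # P{N(s) <= k}
        rhs = gamma.sf(s, a=k + 1, scale=1.0 / lam)      # P{T_(k+1) > s}
        byhand = sum(exp(-lam * s) * (lam * s)**i / factorial(i) for i in range(k + 1))
        print(k, round(lhs, 10), round(rhs, 10), round(byhand, 10))  # all three agree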

A doomed bus commuter. (Feller, An Introduction to Probability Theory and Its Applications, Vol. 2, Section 1.4, Waiting time paradoxes.) Timmy commutes by the buses every day. The bus transit authority claims that their buses arrive on average every $\lambda^{-1} = 10$ minutes. Today he came to the bus stop at time $t = 660$ (which is 5:00pm measured from the start of service at 6:00am). How long is he expected to wait for the next bus?

A stochastic model. We can assume that the successive waiting times between buses, say $W_1, W_2, \dots$, are independent and exponentially distributed with parameter $\lambda$. Thus, the number $N(t)$ of buses having arrived by the time $t$ has a Poisson distribution with parameter $\lambda t$. Let $T_{(k)}$ denote the arrival time of the $k$th bus; $T_{(k)}$ is again given by (1.3). Then his waiting time $V_t$ is formulated as
$$
V_t := T_{(N(t)+1)} - t.
$$
Consider the event $A := \{V_t \le v\}$. If $N(t) = 0$ occurs, then $A \cap \{N(t) = 0\} = \{t < W_1 \le t + v\}$, which gives
$$
P(t < W_1 \le t + v) = e^{-\lambda t} - e^{-\lambda(t+v)} = (1 - e^{-\lambda v})\, e^{-\lambda t}. \tag{1.6}
$$
If $N(t) = k$ occurs for $k = 1, 2, \dots$, then $A \cap \{N(t) = k\} = \{t - T_{(k)} < W_{k+1} \le t + v - T_{(k)},\ T_{(k)} < t\}$. Notice that $T_{(k)}$ and $W_{k+1}$ are independent, and therefore they have the joint density
$$
f_{W_{k+1}, T_{(k)}}(x, y) = \lambda e^{-\lambda x} \cdot \frac{\lambda e^{-\lambda y} (\lambda y)^{k-1}}{(k-1)!}.
$$

The waiting time. Now we can calculate
$$
P\big(t - T_{(k)} < W_{k+1} \le t + v - T_{(k)},\ T_{(k)} < t\big)
= \iint_{\{(x,y):\, t - y < x \le t + v - y,\ 0 < y < t\}} f_{W_{k+1}, T_{(k)}}(x, y)\, dx\, dy
$$
$$
= \int_0^t \frac{\lambda e^{-\lambda y} (\lambda y)^{k-1}}{(k-1)!} \big[ e^{-\lambda(t-y)} - e^{-\lambda(t+v-y)} \big]\, dy
= (1 - e^{-\lambda v})\, e^{-\lambda t} \int_0^t \frac{\lambda (\lambda y)^{k-1}}{(k-1)!}\, dy
= (1 - e^{-\lambda v})\, e^{-\lambda t}\, \frac{(\lambda t)^k}{k!}. \tag{1.7}
$$
By (1.6)-(1.7), we obtain
$$
P(V_t \le v) = 1 - e^{-\lambda v},
$$
where we have applied $\sum_{k=0}^{\infty} \frac{(\lambda t)^k}{k!} = e^{\lambda t}$. Hence, $V_t$ has an exponential distribution with $\lambda$, and the expected waiting time $E[V_t]$ is $\lambda^{-1} = 10$ minutes. We can also observe that $P(V_t \le v,\ N(t) = k) = P(V_t \le v)\, P(N(t) = k)$, and therefore, that $V_t$ and $N(t)$ are independent.

Problem 6. What happens to the entire waiting time $U_t$ between buses observed at the time $t$? This random variable is formulated as
$$
U_t := T_{(N(t)+1)} - T_{(N(t))}.
$$
By using an argument similar to the above, verify that
$$
P(U_t \le u) = \begin{cases} 1 - (1 + \lambda u) e^{-\lambda u} & \text{if } u < t; \\ 1 - (1 + \lambda t) e^{-\lambda u} & \text{if } u \ge t, \end{cases}
$$
and that $E[U_t] = (2 - e^{-\lambda t})\lambda^{-1}$. This indicates that the waiting time is much larger than the average $\lambda^{-1}$. Hint: Let $f_t(u)$ be the density function of $U_t$. For the calculation of the expectation we can use
$$
E[U_t] = \int_0^{\infty} u f_t(u)\, du = \int_0^{\infty} f_t(u) \int_0^u dx\, du = \int_0^{\infty} \int_x^{\infty} f_t(u)\, du\, dx = \int_0^{\infty} P(U_t > x)\, dx.
$$
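Here is a Monte Carlo sketch of the paradox, not part of the original notes, using $\lambda^{-1} = 10$ minutes and $t = 660$ as in the bus example: it records the residual wait $V_t$ and the full gap $U_t$ and compares their averages with $\lambda^{-1}$ and $(2 - e^{-\lambda t})\lambda^{-1}$.

    # Monte Carlo sketch of the waiting-time paradox with lambda = 1/10 (one bus per
    # 10 minutes on average) and t = 660, as in the bus example.  Illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    lam, t, reps = 1.0 / 10.0, 660.0, 50_000
    V, U = np.empty(reps), np.empty(reps)

    for r in range(reps):
        arrivals = np.cumsum(rng.exponential(scale=1.0 / lam, size=200))
        while arrivals[-1] <= t:   # extend the process until it passes time t
            more = arrivals[-1] + np.cumsum(rng.exponential(scale=1.0 / lam, size=200))
            arrivals = np.concatenate([arrivals, more])
        i = np.searchsorted(arrivals, t, side="right")   # first bus after time t
        V[r] = arrivals[i] - t                           # residual wait V_t
        U[r] = arrivals[i] - (arrivals[i - 1] if i > 0 else 0.0)  # full gap U_t

    print(V.mean(), 1.0 / lam)                        # ~ 10 minutes, as derived above
    print(U.mean(), (2.0 - np.exp(-lam * t)) / lam)   # ~ 20 minutes (Problem 6)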

Problem 7. The number of customers arriving at the grocer is a Poisson process with average number $\lambda$ of customers per hour. (a) Let $T$ be the time at which the first customer arrives. What is the distribution of $T$? (b) Let $t > 0$ be fixed, and let $T_t$ be the arrival time of the customer who shows up first after the time $t$. What is the distribution of $T_t$?

Summary of related distributions (Note #1)

The distributions appearing in this note, and how they arise in the arrival model:

Uniform on $(0, t)$: $f(x) = \frac{1}{t}$, $0 < x < t$; $E[X] = t/2$, $\mathrm{Var}(X) = t^2/12$. The arrival times $T_1, \dots, T_n$, with order statistics $T_{(1)} < \dots < T_{(n)}$.

Binomial$(n, p)$: $P\{X = i\} = \binom{n}{i} p^i (1-p)^{n-i}$; $E[X] = np$, $\mathrm{Var}(X) = np(1-p)$. The count $N(s)$ given $n$ uniform arrivals, with $p = s/t$; also $P(N(s) = k \mid N(t) = n)$ in the Poisson process.

Poisson$(\lambda)$: $P\{X = i\} = e^{-\lambda} \frac{\lambda^i}{i!}$, $i = 0, 1, \dots$; $E[X] = \lambda$, $\mathrm{Var}(X) = \lambda$. Approximates Binomial$(n, p)$ if $p$ is small enough; the counts $N(s)$ and $N(t)$ in the Poisson process.

Beta$(k, n+1-k)$: $f(x) = \frac{\Gamma(n+1)}{\Gamma(k)\Gamma(n+1-k)}\, x^{k-1}(1-x)^{n-k}$, $0 < x < 1$; $E[X] = \frac{k}{n+1}$, $\mathrm{Var}(X) = \frac{k(n+1-k)}{(n+1)^2(n+2)}$. The scaled order statistic $T_{(k)}/t$ given $N(t) = n$; also $T_{(k)}/T_{(n+1)}$.

Gamma$(n, \lambda)$: $f(x) = \frac{\lambda e^{-\lambda x}(\lambda x)^{n-1}}{\Gamma(n)}$, $x \ge 0$; $E[X] = n/\lambda$, $\mathrm{Var}(X) = n/\lambda^2$. The arrival time $T_{(k)} = W_1 + \dots + W_k$ is Gamma$(k, \lambda)$, and $P\{N(s) \le k\} = P\{T_{(k+1)} > s\}$.

Exponential$(\lambda)$: $f(x) = \lambda e^{-\lambda x}$, $x \ge 0$; $E[X] = 1/\lambda$, $\mathrm{Var}(X) = 1/\lambda^2$. The waiting times $W_i$; equals Gamma$(1, \lambda)$.

Problem solutions (Note #1)

Problem 1. Observe that $N$ is also a Poisson random variable with parameter $(\alpha + \beta)$. Then we can calculate
$$
P(X = k \mid N = n) = \frac{P(X = k,\ N = n)}{P(N = n)} = \frac{P(X = k)\, P(Y = n - k)}{P(N = n)}
= \frac{e^{-\alpha} \frac{\alpha^k}{k!}\; e^{-\beta} \frac{\beta^{n-k}}{(n-k)!}}{e^{-(\alpha+\beta)} \frac{(\alpha+\beta)^n}{n!}}
= \binom{n}{k} \left(\frac{\alpha}{\alpha+\beta}\right)^k \left(\frac{\beta}{\alpha+\beta}\right)^{n-k},
$$
which is a binomial distribution with parameter $\left(n, \frac{\alpha}{\alpha+\beta}\right)$.

Problem 2. (a) We can calculate
$$
P(X = j,\ Y = k) = P(X = j,\ N = j + k) = P(X = j \mid N = j + k)\, P(N = j + k)
= \binom{j+k}{j} p^j (1 - p)^k\, e^{-\lambda} \frac{\lambda^{j+k}}{(j+k)!}
= e^{-\lambda} \frac{(\lambda p)^j}{j!} \cdot \frac{(\lambda(1-p))^k}{k!}.
$$
(b) Using (a) we obtain
$$
P(X = j) = \sum_{k=0}^{\infty} P(X = j,\ Y = k) = e^{-\lambda} \frac{(\lambda p)^j}{j!} \sum_{k=0}^{\infty} \frac{(\lambda(1-p))^k}{k!} = e^{-\lambda p} \frac{(\lambda p)^j}{j!}.
$$
Similarly we can find that $Y$ has a Poisson distribution with parameter $\lambda(1-p)$.
(c) By applying (a) and (b), we have
$$
P(X = j,\ Y = k) = e^{-\lambda p} \frac{(\lambda p)^j}{j!} \cdot e^{-\lambda(1-p)} \frac{(\lambda(1-p))^k}{k!} = P(X = j)\, P(Y = k),
$$
which implies that $X$ and $Y$ are independent.
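A small simulation, not part of the original notes, confirming the conclusion of Problem 2: thinning a Poisson count with probability $p$ produces two independent Poisson counts with means $\lambda p$ and $\lambda(1-p)$.

    # Simulation check of the Problem 2 solution (Poisson thinning): if N is
    # Poisson(lambda) and X | N = n is Binomial(n, p), then X and Y = N - X are
    # independent Poissons with means lambda*p and lambda*(1-p).  Sketch only.
    import numpy as np

    rng = np.random.default_rng(3)
    lam, p, reps = 4.0, 0.3, 200_000

    N = rng.poisson(lam, size=reps)
    X = rng.binomial(N, p)          # numpy broadcasts over the array of n's
    Y = N - X

    print(X.mean(), X.var(), lam * p)          # mean and variance both ~ 1.2
    print(Y.mean(), Y.var(), lam * (1 - p))    # mean and variance both ~ 2.8
    print(np.corrcoef(X, Y)[0, 1])             # ~ 0, consistent with independence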

Problem 3. (a) From the construction of the Poisson process, we have $N(t) - N(s) = N - N(s)$. In Problem 2 we found that $N - N(s)$ has a Poisson distribution with parameter $\lambda(t - s)$, and that $N(s)$ and $N - N(s)$ are independent. Therefore, we have
$$
E[(N(t) - N(s))\, N(s)] = E[N(t) - N(s)]\, E[N(s)] = \lambda(t - s) \cdot \lambda s.
$$
(b) Since $E[N(s)^2] = \lambda s + (\lambda s)^2$, we obtain $E[N(s) N(t)] = \lambda^2 s t + \lambda s$.

Problem 4. (a) Observe that $T_{(k)} \le tu$ if and only if $N(tu) \ge k$. We obtain
$$
G_n(u) = P(N(tu) \ge k \mid N(t) = n) = \sum_{i=k}^{n} \binom{n}{i} u^i (1 - u)^{n-i}.
$$
(b) We can easily see that
$$
g_n(u) = \frac{d}{du} G_n(u) = \frac{n!}{(k-1)!\,(n-k)!}\, u^{k-1} (1 - u)^{n-k} = \frac{\Gamma(n+1)}{\Gamma(k)\Gamma(n-k+1)}\, u^{k-1} (1 - u)^{n-k}
$$
is the beta density function.

Problem 5. Let $A$ be the event that no car is sold. Observe that $P(A \mid N(t) = k) = \alpha^k$. Then we obtain
$$
P(A) = \sum_{k=0}^{\infty} P(A \mid N(t) = k)\, P(N(t) = k) = \sum_{k=0}^{\infty} \alpha^k e^{-\lambda t} \frac{(\lambda t)^k}{k!} = e^{-\lambda t} e^{\lambda t \alpha} = e^{-\lambda t (1 - \alpha)}.
$$

Problem 6. Let $B = \{U_t \le u\}$.

Case I: Suppose that $u < t$. Then we have
$$
P(U_t \le u) = \sum_{k=0}^{\infty} P(B \cap \{N(t) = k\}) = \sum_{k=1}^{\infty} P\big(t - u \le T_{(k)} < t,\ t - T_{(k)} < W_{k+1} \le u\big)
$$
$$
= \sum_{k=1}^{\infty} \int_{t-u}^{t} \frac{\lambda e^{-\lambda y} (\lambda y)^{k-1}}{(k-1)!} \int_{t-y}^{u} \lambda e^{-\lambda x}\, dx\, dy
= 1 - e^{-\lambda u} - \lambda u e^{-\lambda u}.
$$

Case II: Suppose that $u \ge t$. Then we have
$$
P(U_t \le u) = \sum_{k=0}^{\infty} P(B \cap \{N(t) = k\}) = P(t < W_1 \le u) + \sum_{k=1}^{\infty} P\big(T_{(k)} < t,\ t - T_{(k)} < W_{k+1} \le u\big)
$$
$$
= e^{-\lambda t} - e^{-\lambda u} + \sum_{k=1}^{\infty} \int_{0}^{t} \frac{\lambda e^{-\lambda y} (\lambda y)^{k-1}}{(k-1)!} \int_{t-y}^{u} \lambda e^{-\lambda x}\, dx\, dy
= 1 - e^{-\lambda u} - \lambda t e^{-\lambda u}.
$$
Applying the formula in the hint, we obtain
$$
E[U_t] = \int_0^t \big[ e^{-\lambda u} + \lambda u e^{-\lambda u} \big]\, du + \int_t^{\infty} \big[ e^{-\lambda u} + \lambda t e^{-\lambda u} \big]\, du = (2 - e^{-\lambda t})\lambda^{-1}.
$$

Problem 7. (a) $T = T_{(1)} = W_1$ has an exponential distribution with $\lambda$. (b) Using an exponential random variable $V_t$ with $\lambda$, we can express $T_t = T_{(N(t)+1)} = t + V_t$, which has the density $f(x)$ of a shifted exponential distribution:
$$
f(x) = \begin{cases} 0 & \text{if } x < t; \\ \lambda e^{-\lambda(x - t)} & \text{if } x \ge t. \end{cases}
$$
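To close, a quick simulation, not part of the original notes, of the Problem 5 answer $e^{-\lambda t(1-\alpha)}$: each of a Poisson$(\lambda t)$ number of customers independently fails to buy with probability $\alpha$.

    # Simulation check of the Problem 5 solution: the probability that no car is
    # sold by time t is exp(-lambda*t*(1 - alpha)).  Illustrative sketch only.
    import numpy as np

    rng = np.random.default_rng(4)
    lam, t, alpha, reps = 2.0, 1.5, 0.6, 200_000

    N = rng.poisson(lam * t, size=reps)      # number of customers by time t
    sales = rng.binomial(N, 1.0 - alpha)     # each customer buys with prob. 1 - alpha
    print((sales == 0).mean(), np.exp(-lam * t * (1.0 - alpha)))   # both ~ 0.301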