Solutions For Stochastic Process Final Exam


1 (a) λ_BMW = 20 × 10% = 2, so the BMW count is Poisson with rate 2. Let N_t be the number of BMWs that have passed during [0, t]. Then the probability in question is

P(N_1 ≥ 1) = 1 - P(N_1 = 0) = 1 - e^{-2}.

(b) λ_non-BMW = 20 × 90% = 18, so the non-BMW count is Poisson with rate 18. Let M_t be the number of non-BMWs that have passed during [0, t]; M is independent of N. Then the expectation in question is

E(N_{10} + M_{10} | N_{10} = 5) = E(N_{10} | N_{10} = 5) + E(M_{10} | N_{10} = 5) = 5 + E(M_{10}) = 5 + λ_non-BMW · 10 = 5 + 180 = 185.

(c)

P{N_1 = 0 | N_3 = 5} = P{N_1 = 0, N_3 = 5} / P{N_3 = 5}
= P{N_1 = 0, N_3 - N_1 = 5} / P{N_3 = 5}
= P{N_1 = 0} P{N_3 - N_1 = 5} / P{N_3 = 5}
= [e^{-2} · e^{-4} 4^5 / 5!] / [e^{-6} 6^5 / 5!]
= (2/3)^5 = 32/243.

(d)

P{N_4 = 6, M_4 = 6 | N_{10} = 8} = P{N_4 = 6, N_{10} = 8, M_4 = 6} / P{N_{10} = 8}
= P{N_4 = 6, N_{10} - N_4 = 2, M_4 = 6} / P{N_{10} = 8}
= P{N_4 = 6} P{N_{10} - N_4 = 2} P{M_4 = 6} / P{N_{10} = 8}
= [e^{-8} 8^6 / 6! · e^{-12} 12^2 / 2! · e^{-72} 72^6 / 6!] / [e^{-20} 20^8 / 8!].

(e) E(T_{50}) = 50/2 = 25. The expected time in question is therefore 2:25 PM.
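As a quick numerical sanity check of (c), the conditional probability can be estimated by Monte Carlo, using the fact that the increments N_1 and N_3 - N_1 are independent Poisson variables. This is a minimal sketch, not part of the exam; numpy and the rate 2 from (a) are the only inputs:

    import numpy as np

    # Check of 1(c): P(N_1 = 0 | N_3 = 5) = (2/3)^5 = 32/243 for a rate-2 Poisson process.
    rng = np.random.default_rng(0)
    n01 = rng.poisson(2 * 1, size=2_000_000)   # increment over [0, 1]
    n13 = rng.poisson(2 * 2, size=2_000_000)   # increment over (1, 3], independent
    cond = (n01 + n13 == 5)                    # condition on N_3 = 5
    print((n01[cond] == 0).mean(), 32 / 243)   # both approximately 0.1317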

2 Firstly, we compute

E[Z(t) | N(t) = n]
= E[ Σ_{k=1}^{N(t)} Θ_k(t) | N(t) = n ]
= E[ Σ_{k=1}^{n} ξ_k e^{-α(t - S_k)} | N(t) = n ]
= Σ_{k=1}^{n} E[ ξ_k e^{-α(t - S_k)} | N(t) = n ]
= Σ_{k=1}^{n} E[ξ_k | N(t) = n] E[e^{-α(t - S_k)} | N(t) = n]
= E(ξ_1) e^{-αt} Σ_{k=1}^{n} E[e^{α S_k} | N(t) = n]
= E(ξ_1) e^{-αt} E[e^{α S_1} + e^{α S_2} + ⋯ + e^{α S_n} | N(t) = n]
= E(ξ_1) e^{-αt} E[e^{α U_1} + e^{α U_2} + ⋯ + e^{α U_n}]
= E(ξ_1) e^{-αt} n E[e^{α U_1}]
= E(ξ_1) e^{-αt} n (e^{αt} - 1)/(αt)
= n E(ξ_1) (1 - e^{-αt})/(αt),

where we used the fact that, given N(t) = n, the arrival times (S_1, …, S_n) are distributed as the order statistics of n iid Uniform(0, t) variables U_1, …, U_n. Then

E[Z(t)] = E[ E[Z(t) | N(t)] ] = E[N(t)] E(ξ_1) (1 - e^{-αt})/(αt) = λt E(ξ_1) (1 - e^{-αt})/(αt) = λ E(ξ_1) (1 - e^{-αt})/α.
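The closed form E[Z(t)] = λ E(ξ_1)(1 - e^{-αt})/α is easy to check by simulation, using the same order-statistics property invoked above. A minimal sketch; λ, α, t, and the Exp(1) mark distribution are illustrative choices, not given in the exam:

    import numpy as np

    # Simulation check of E[Z(t)] = lam * E[xi] * (1 - exp(-alpha*t)) / alpha.
    rng = np.random.default_rng(1)
    lam, alpha, t, trials = 3.0, 0.5, 4.0, 200_000
    vals = np.empty(trials)
    for i in range(trials):
        n = rng.poisson(lam * t)
        s = rng.uniform(0, t, size=n)       # given N(t) = n, arrivals are iid U(0, t)
        xi = rng.exponential(1.0, size=n)   # iid marks with E[xi] = 1
        vals[i] = np.sum(xi * np.exp(-alpha * (t - s)))
    print(vals.mean(), lam * 1.0 * (1 - np.exp(-alpha * t)) / alpha)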

3 (a) Here T is the number of rolls up to and including the first 6, capped at 3. We have

{T = 1} = {X_1 = 6} ∈ σ(X_1),
{T = 2} = {(X_1, X_2) : X_1 ∈ {1, 2, …, 5}, X_2 = 6} ∈ σ(X_1, X_2),
{T = 3} = {(X_1, X_2, X_3) : X_1 ∈ {1, 2, …, 5}, X_2 ∈ {1, 2, …, 5}, X_3 ∈ {1, 2, …, 6}} ∈ σ(X_1, X_2, X_3),

and for k > 3, {T = k} = ∅ ∈ σ(X_1, X_2, X_3). Hence T is a stopping time.

(b) S_T = Σ_{i=1}^{T} X_i, and

E(S_T) = E(X_1 + X_2 + ⋯ + X_T)
= E(X_1) E(T)   (by Wald's equation)
= [(1 + 2 + ⋯ + 6) · (1/6)] · [1 · (1/6) + 2 · (5/6)(1/6) + 3 · (5/6)^2]
= (7/2) · (91/36)
= 637/72.

(c) n = 1: E(S_T | T = 1) = E(X_1 | T = 1) = 6.

n = 2: E(S_T | T = 2) = E(X_1 + X_2 | T = 2) = E(X_1 | T = 2) + E(X_2 | T = 2) = (1/5)(1 + 2 + ⋯ + 5) + 6 = 9.

n = 3: E(S_T | T = 3) = E(X_1 + X_2 + X_3 | T = 3) = E(X_1 | T = 3) + E(X_2 | T = 3) + E(X_3 | T = 3) = (1/5)(1 + 2 + ⋯ + 5) + (1/5)(1 + 2 + ⋯ + 5) + (1/6)(1 + 2 + ⋯ + 6) = 9.5.

(d) E(S_T) = E[E(S_T | T)]
= P(T = 1) E(S_T | T = 1) + P(T = 2) E(S_T | T = 2) + P(T = 3) E(S_T | T = 3)
= (1/6) · 6 + (5/6)(1/6) · 9 + (5/6)(5/6) · 9.5
= 637/72,

in agreement with (b).
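Both the Wald computation in (b) and the conditioning in (d) give E(S_T) = 637/72 ≈ 8.847, which a direct simulation of the game in (a) confirms: roll a fair die, stop at the first 6 or after three rolls. A minimal sketch (numpy assumed):

    import numpy as np

    # Average the stopped sum S_T over many plays of the capped-roll game.
    rng = np.random.default_rng(2)
    trials = 1_000_000
    rolls = rng.integers(1, 7, size=(trials, 3))
    has6 = (rolls == 6).any(axis=1)
    stop = np.where(has6, np.argmax(rolls == 6, axis=1), 2)  # index of the last counted roll
    keep = np.arange(3) <= stop[:, None]                     # rolls up to and including stop
    print((rolls * keep).sum(axis=1).mean(), 637 / 72)       # both approximately 8.847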

4 (a) Let x denote the number of machines in operating condition at the end of a period, and let y denote the number of machines that have been under repair for one period. Then the state space is S = {(x, y)} = {(2,0), (1,0), (1,1), (0,0), (0,1)}, and, with the states in that order, the transition matrix is

            (2,0)      (1,0)     (1,1)   (0,0)   (0,1)
(2,0)  [  (1-α)^2   2α(1-α)      0       α^2      0  ]
(1,0)  [     0          0       1-β       0       β  ]
(1,1)  [    1-β         β        0        0       0  ]
(0,0)  [     0          0        0        0       1  ]
(0,1)  [     0          1        0        0       0  ]

(b) Let π = (π_1, π_2, π_3, π_4, π_5) be the limiting distribution of the Markov chain. Solve

(1-α)^2 π_1 + (1-β) π_3 = π_1
2α(1-α) π_1 + β π_3 + π_5 = π_2
(1-β) π_2 = π_3
α^2 π_1 = π_4
β π_2 + π_4 = π_5
π_1 + π_2 + π_3 + π_4 + π_5 = 1;

then, writing D = (2α^2 + 1)(1-β)^2 + 2α(2-α), we have

π_1 = (1-β)^2 / D,
π_2 = α(2-α) / D,
π_3 = α(1-β)(2-α) / D,
π_4 = α^2 (1-β)^2 / D,
π_5 = [αβ(2-α) + α^2 (1-β)^2] / D.

The probability in question is

π_1 + π_2 + π_3 = [(1-β)^2 + α(2-α)(2-β)] / D = 95.08%   at α = 0.1, β = 0.2.
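The closed-form π can be checked numerically: build P for a concrete (α, β), solve πP = π, and compare π_1 + π_2 + π_3 with the formula above. A minimal sketch at α = 0.1, β = 0.2 (numpy assumed):

    import numpy as np

    # States ordered (2,0), (1,0), (1,1), (0,0), (0,1), matching the matrix above.
    a, b = 0.1, 0.2
    P = np.array([
        [(1-a)**2, 2*a*(1-a), 0,     a**2, 0],
        [0,        0,         1 - b, 0,    b],
        [1 - b,    b,         0,     0,    0],
        [0,        0,         0,     0,    1],
        [0,        1,         0,     0,    0],
    ])
    w, v = np.linalg.eig(P.T)                       # stationary pi is the left eigenvector for eigenvalue 1
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi /= pi.sum()
    D = (2*a**2 + 1) * (1-b)**2 + 2*a*(2-a)
    print(pi[:3].sum(), ((1-b)**2 + a*(2-a)*(2-b)) / D)  # both approximately 0.9508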

5 (a) P is the 4 × 4 transition matrix specified in the problem, with entries built from the constants c and d (fractions of the form c/(c+d)).

(b) Multiplying out gives P^2 = P · P, and a further multiplication gives

P^3 = P^2 · P = P.

By induction, for k ∈ N,

P^k = P,   if k = 2n + 1,
P^k = P^2, if k = 2n.

Therefore the even and odd powers converge separately:

lim_{n→∞} P^{2n} = P^2   and   lim_{n→∞} P^{2n+1} = P,

where P and P^2 are the matrices computed above; the chain has no single limiting matrix.

6 For 0 ≤ τ < t,

E(N_t - λt | N_u, u ≤ τ)
= E(N_t - N_τ + N_τ - λτ + λτ - λt | N_u, u ≤ τ)
= E(N_t - N_τ) + N_τ - λτ + λτ - λt
= λ(t - τ) + N_τ - λτ - λ(t - τ)
= N_τ - λτ,

where we used the independence of the increment N_t - N_τ from the past. Hence {N_t - λt : t ≥ 0} is a martingale.

7 (a) P{V_t > s | N_u, u ≤ t} = P{N_{t+s} - N_t < 1 | N_u, u ≤ t} = P{N_{t+s} - N_t < 1} = P{N_s < 1} = P{N_s = 0} = e^{-λs},

by the independence and stationarity of the increments. Hence V_t, the waiting time from t until the next arrival, is Exp(λ) regardless of the history up to t.
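Part (a) says the residual time V_t = T_{N_t + 1} - t is Exp(λ) no matter what happened up to t, and this memorylessness shows up cleanly in simulation. A minimal sketch; λ = 2 and t = 5 are illustrative choices:

    import numpy as np

    # Estimate the mean of V_t = T_{N_t + 1} - t and the tail P(V_t > 1).
    rng = np.random.default_rng(3)
    lam, t, trials = 2.0, 5.0, 200_000
    resid = np.empty(trials)
    for i in range(trials):
        arr = np.cumsum(rng.exponential(1 / lam, size=40))  # enough arrivals to pass t
        resid[i] = arr[arr > t][0] - t                      # first arrival after t
    print(resid.mean(), 1 / lam)                # both approximately 0.5
    print((resid > 1.0).mean(), np.exp(-lam))   # both approximately 0.135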

(b) Write U_t = t - T_{N_t} for the age at time t. If U_t were a stopping time of the Poisson process, then {U_t ≤ s} would be an event in the σ-algebra σ{N_u : u ≤ s} for all s > 0. Mathematically, this amounts to the following relation:

{U_t ≤ s} ∈ σ{N_u : u ≤ s},   s > 0.   (1)

Each sample point ω of σ{N_u : u ≤ s} is a coset (or equivalence class) obtained by ignoring the information carried by each sample path after time s, so that if two sample paths differ from each other only after time s, both are members of the same coset ω. Then, to verify (1), we need to answer the following query with a deterministic YES/NO answer:

for each ω ∈ σ{N_u : u ≤ s}, is ω ∈ {U_t ≤ s}?   (2)

We note that {U_t ≤ s} = {t - T_{N_t} ≤ s} = {T_{N_t} ≥ t - s}. In other words, the set {U_t ≤ s} consists of all the sample points whose last arrival prior to t occurred at or after the time mark t - s. Without loss of generality, we assume that t - s < s; the other case can be proved similarly.

Suppose ω ∈ σ{N_u : u ≤ s} has an arrival in the time interval [t - s, s]. In this case, regardless of what could happen after time s, we are certain that this ω (to be precise, every member of the coset ω) must have its last arrival prior to t happen after t - s. The answer to query (2) is therefore YES for such an ω.

On the other hand, suppose ω ∈ σ{N_u : u ≤ s} does not have an arrival in the time interval [t - s, s]. Then some members of ω belong to {T_{N_t} ≥ t - s}, namely those with an arrival in (s, t], whereas the members without an arrival in (s, t] do not. Since ω lacks information after s, both situations are possible. As a result, we do not have a definite YES or NO answer to query (2) for such an ω. Hence U_t is not a stopping time of the Poisson process.

As for V_t: to check whether V_t is a stopping time, we would have to verify {V_t ≤ s} ∈ σ{N_u : u ≤ s}. This cannot hold for s < t: given the information only up to time s < t, we can say nothing about what will happen after time t. In other words, V_t is not a stopping time either.

(c) We first claim that T_{N_t} is not a stopping time. For s < t we have {T_{N_t} ≤ s} = {N_t - N_s = 0}: the last arrival before t happened at or before the earlier time s if, and only if, there is no arrival in (s, t]. However, {N_t - N_s = 0} ∉ σ{N_u : u ≤ s}, since the event {N_t - N_s = 0} involves information after time s. Hence T_{N_t} is not a stopping time.

On the other hand, T_{N_t + 1} can be shown to be a stopping time. If s ≤ t, then since T_{N_t + 1} > t by definition, we have {T_{N_t + 1} ≤ s} = ∅ ∈ σ{N_u : u ≤ s}. If s > t, then we can observe the information in (t, s] for each ω in σ{N_u : u ≤ s}: if ω has no arrival in (t, s], it cannot belong to {T_{N_t + 1} ≤ s}; if ω has at least one arrival in (t, s], it does belong to {T_{N_t + 1} ≤ s}. Namely, we have just shown

{T_{N_t + 1} ≤ s} = {N_s - N_t ≥ 1} ∈ σ{N_u : u ≤ s}.

The Poisson process "paradox" does not contradict the strong Markov property, because T_{N_t + 1} - T_{N_t} is not the difference of two stopping times (T_{N_t} is not one); therefore the distribution of T_{N_t + 1} - T_{N_t} is not necessarily exponential.
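The conclusion of (c) can also be seen numerically: the length of the interval [T_{N_t}, T_{N_t + 1}] covering t is not Exp(λ), and for t much larger than 1/λ its mean approaches 2/λ. A minimal sketch, with the same illustrative λ and t as above:

    import numpy as np

    # Mean length of the inter-arrival interval that covers time t.
    rng = np.random.default_rng(4)
    lam, t, trials = 2.0, 5.0, 200_000
    length = np.empty(trials)
    for i in range(trials):
        arr = np.cumsum(rng.exponential(1 / lam, size=40))
        k = np.searchsorted(arr, t)              # index of the first arrival after t
        prev = arr[k - 1] if k > 0 else 0.0      # last arrival at or before t
        length[i] = arr[k] - prev
    print(length.mean(), 2 / lam)                # approximately 1.0, twice the mean spacing 1/lam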

(d) In E[T_{n+1} - T_n] we are averaging the inter-arrival time between the n-th and the (n+1)-th event over all possible sample points ω. On the other hand, given an observation time t, E[T_{N_t + 1} - T_{N_t}] averages, for each ω, the particular interval [T_{N_t}, T_{N_t + 1}] that covers t. Therefore there is no paradox: we are talking about two different quantities, E[T_{n+1} - T_n] and E[T_{N_t + 1} - T_{N_t}], and in class we have seen that the latter is actually larger.

Why is E[T_{N_t + 1} - T_{N_t}] larger? In a Poisson process the inter-arrival times are iid exponentially distributed with mean 1/λ. More precisely, the length of the time interval [T_1, T_2] is random, depending on ω, and its average is E[T_2 - T_1] = 1/λ. Similarly, the length of the time interval [T_5, T_6] is random, depending on ω, but independent of the length (T_2 - T_1)(ω), obeying again the exponential distribution with average E[T_6 - T_5] = 1/λ. In general, E[T_{n+1} - T_n] = 1/λ for every n. This average is a sort of space average, as it is taken over the sample space. Do we also have a time average for the inter-arrival times here? The answer is positive, as the Poisson process is Markovian. Let us consider (with T_0 = 0)

E[ (1/m) Σ_{k=1}^{m} (T_k - T_{k-1}) ].   (3)

It is easily computed that this expected value is 1/λ, since each term has mean 1/λ. Given a sample point ω, (1/m) Σ_{k=1}^{m} (T_k - T_{k-1})(ω) is the average of the first m inter-arrival times for this ω; taking the mean over all possible ω of these averages is the meaning of (3). Now let m tend to infinity: the limit is 1/λ, the long-run mean of the inter-arrival times, over all sample points and over all inter-arrival times.

Now we can interpret why E[T_{N_t + 1} - T_{N_t}] is longer. We first notice that, for any ω, its inter-arrival times are not of uniform length, but consist of short and long periods. According to (3), the overall average of the short and long ones is 1/λ over all possible ω. If we make a random observation at time t in a particular ω, and the observation is independent of everything in the Poisson process, it is more likely that we hit one of the longer intervals of ω. Similarly, if another ω is observed, we again have a better chance of hitting its longer intervals. In other words, for almost every ω we tend to observe a longer-than-typical interval. Consequently, the average observed interval is longer than the average inter-arrival time, which is 1/λ, so that we have

E[T_{N_t + 1} - T_{N_t}] > 1/λ = E[T_{n+1} - T_n].

On a particular day ω when you go to ride a bus whose arrival process is Poisson, the majority of passengers will experience the longer inter-arrival periods of the day. As a result, upon your arrival (think of yourself as belonging to that majority, unless you get lucky), regardless of when the previous bus left the station, the expected remaining time until the next bus is still 1/λ. That is, if the timetable says the bus runs on a 15-minute basis, then no matter when you arrive at the station, you should be prepared to wait 15 minutes on average.
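The bus example translates directly into code: with a mean headway of 15 minutes, the expected wait is about 15 minutes whether you show up at minute 10, 37, or 120. A minimal sketch; the Poisson model and the three passenger arrival times are illustrative choices:

    import numpy as np

    # Expected wait for the next bus, for several passenger arrival times (minutes).
    rng = np.random.default_rng(5)
    lam, trials = 1 / 15, 100_000                # one bus per 15 minutes on average
    for arrive in (10.0, 37.0, 120.0):
        waits = np.empty(trials)
        for i in range(trials):
            buses = np.cumsum(rng.exponential(1 / lam, size=40))  # covers 600 min on average
            waits[i] = buses[buses > arrive][0] - arrive
        print(arrive, waits.mean())              # each approximately 15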