Examination paper for TMA4265 Stochastic Processes


Department of Mathematical Sciences

Examination paper for TMA4265 Stochastic Processes

Academic contact during examination: Andrea Riebler
Phone: 456 89 592

Examination date: December 14th, 2015
Examination time (from-to): 09:00-13:00

Permitted examination support material: C: Calculator CITIZEN SR-270X, CITIZEN SR-270X College, HP30S, Casio fx-82ES PLUS with empty memory. Tabeller og formler i statistikk, Tapir forlag. K. Rottmann: Matematisk formelsamling. Bilingual dictionary. One yellow, stamped A5 sheet with own handwritten formulas and notes (on both sides).

Other information: Note that all answers must be justified. In your solution you can use English and/or Norwegian. All ten subproblems are approximately equally weighted.

Language: English
Number of pages: 3
Number of pages enclosed: 3

Checked by: Date, Signature

TMA4265 Stochastic Processes, December 14th, 2015, Page 1 of 3

Problem 1

On any given day Gary is either cheerful (0), so-so (1), or glum (2). Let $X_n$ denote Gary's mood on the $n$-th day and assume $\{X_n,\ n \geq 0\}$ is a Markov chain with transition probability matrix (rows and columns ordered by the states 0, 1, 2)
$$P = \begin{pmatrix} 0.4 & x & 0 \\ 0.3 & 0.6 & y \\ 0 & 0.45 & z \end{pmatrix}.$$

a) Find $x$, $y$ and $z$. Derive $P(X_3 = 0 \mid X_2 = 0, X_1 \neq 0, X_0 = 0)$. Assume Gary's mood is so-so. What is the probability that when Gary changes his mood, he changes to glum rather than cheerful?

b) Determine the long-run proportion of time that Gary is in a glum mood.

Problem 2

A coin is flipped repeatedly and independently. The probability of getting heads (H) is denoted by $p$, while the probability of getting tails (T) is $q = 1 - p$.

a) Use a Markov chain formulation and first-step analysis to derive the expected number of flips made until the pattern H, T, H appears for the first time.

Problem 3

a) Give an example of a discrete-time Markov chain with state space $S = \{0, 1, 2\}$ that is irreducible and ergodic but not time-reversible. Explain why your example is a valid choice, i.e. why it fulfills all the mentioned criteria.

Page 2 of 3, TMA4265 Stochastic Processes, December 14th, 2015

Problem 4

David goes fishing and catches fish according to a Poisson process with rate 2 per hour.

a) What is the probability that he catches at least 2 fish within the first half hour? Assume David caught exactly 3 fish in the first hour. When is he expected to catch the 9th fish?

Suppose further that each catch is a salmon with probability $p = 0.4$ and a trout with probability $q = 1 - p = 0.6$. Whether a catch is a salmon or a trout is independent of all other catches.

b) Derive the probability that David will catch exactly 1 salmon and 2 trout in a given 2 hour and 30 minute period.

Assume David is fishing together with two friends. Each of them also catches fish (independently of the others) according to a Poisson process with rate 2.

c) What is the expected time until everyone has caught at least one fish?

Problem 5

A tourist guide is stationed at a port and hired to give sightseeing tours with his boat. If the guide is free, it takes an interested tourist a certain time, exponentially distributed with mean $1/\mu_1$, to negotiate the price and type of tour. Assume there is a probability $\alpha$ that no agreement is reached and the tourist leaves. If the tourist and the guide reach an agreement, they start the tour immediately. The duration of the tour is exponentially distributed with mean $1/\mu_2$. Suppose that interested tourists arrive at the guide's station according to a Poisson process with rate $\lambda$ and request service only when the guide is free, i.e. the guide is neither negotiating nor on a tour with another tourist. Suppose further that the Poisson process, the negotiation time, the duration of the tour, and whether the tourist and the guide reach an agreement are all independent.

a) Set up the balance equations and derive the proportion of time (in the long run) that the guide is free.

b) Find the transition probability matrix of the embedded discrete-time Markov chain. Determine the period of all states of this Markov chain.

TMA4265 Stochastic Processes, December 14th, 2015, Page 3 of 3

Problem 6

Assume you are able to simulate only standard normally distributed random variables.

a) Give an algorithm for simulating a Brownian motion process with drift coefficient $\mu$ and variance parameter $\sigma^2$ at times $0 = t_0 < t_1 < t_2 < \cdots < t_k$.

TMA4265 Stochastic Processes, December 14th, 2015, Page i of iii

Formulas for TMA4265 Stochastic Processes:

The law of total probability

Let $B_1, B_2, \ldots$ be pairwise disjoint events with $P\left(\bigcup_{i=1}^{\infty} B_i\right) = 1$. Then
$$P(A \mid C) = \sum_{i=1}^{\infty} P(A \mid B_i \cap C)\, P(B_i \mid C),$$
$$E[X \mid C] = \sum_{i=1}^{\infty} E[X \mid B_i \cap C]\, P(B_i \mid C).$$

Discrete-time Markov chains

Chapman-Kolmogorov equations:
$$P^{(m+n)}_{ij} = \sum_{k=0}^{\infty} P^{(m)}_{ik} P^{(n)}_{kj}.$$

For an irreducible and ergodic Markov chain, $\pi_j = \lim_{n \to \infty} P^{(n)}_{ij}$ exists and is given by the equations
$$\pi_j = \sum_i \pi_i P_{ij} \quad \text{and} \quad \sum_i \pi_i = 1.$$

For transient states $i$, $j$ and $k$, the expected time spent in state $j$ given start in state $i$, $s_{ij}$, is
$$s_{ij} = \delta_{ij} + \sum_k P_{ik} s_{kj}.$$

For transient states $i$ and $j$, the probability of ever returning to state $j$ given start in state $i$, $f_{ij}$, is
$$f_{ij} = (s_{ij} - \delta_{ij}) / s_{jj}.$$

The Poisson process

The waiting time to the $n$-th event (the $n$-th arrival time), $S_n$, has the probability density
$$f_{S_n}(t) = \frac{\lambda^n t^{n-1}}{(n-1)!}\, e^{-\lambda t} \quad \text{for } t \geq 0.$$

Given that the number of events $N(t) = n$, the arrival times $S_1, S_2, \ldots, S_n$ have the joint probability density
$$f_{S_1, S_2, \ldots, S_n \mid N(t)}(s_1, s_2, \ldots, s_n \mid n) = \frac{n!}{t^n} \quad \text{for } 0 < s_1 < s_2 < \ldots < s_n \leq t.$$
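As a quick numerical cross-check of the limiting-distribution equations above (this sketch is not part of the official formula sheet, and the transition matrix in it is purely illustrative), one can solve $\pi = \pi P$ together with $\sum_i \pi_i = 1$ for a small chain, for example in Python with NumPy:

```python
# Minimal sketch: solve pi = pi P with sum(pi) = 1 for a finite ergodic chain.
# The matrix P below is an illustrative placeholder, not taken from the exam.
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

n = P.shape[0]
# Stack the balance equations (P^T - I) pi = 0 with the normalisation sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # long-run proportions pi_j = lim_n P_ij^(n)
```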

Page ii of iii, TMA4265 Stochastic Processes, December 14th, 2015

Markov processes in continuous time

A (homogeneous) Markov process $X(t)$, $t \geq 0$, with state space $\Omega \subseteq \mathbb{Z}^+ = \{0, 1, 2, \ldots\}$, is called a birth and death process if
$$P_{i,i+1}(h) = \lambda_i h + o(h),$$
$$P_{i,i-1}(h) = \mu_i h + o(h),$$
$$P_{i,i}(h) = 1 - (\lambda_i + \mu_i) h + o(h),$$
$$P_{ij}(h) = o(h) \quad \text{for } |j - i| \geq 2,$$
where $P_{ij}(s) = P(X(t+s) = j \mid X(t) = i)$ for $i, j \in \mathbb{Z}^+$, the $\lambda_i \geq 0$ are birth rates, and the $\mu_i \geq 0$ are death rates.

The Chapman-Kolmogorov equations:
$$P_{ij}(t+s) = \sum_{k=0}^{\infty} P_{ik}(t) P_{kj}(s).$$

Limit relations:
$$\lim_{h \to 0} \frac{1 - P_{ii}(h)}{h} = v_i, \qquad \lim_{h \to 0} \frac{P_{ij}(h)}{h} = q_{ij}, \quad i \neq j.$$

Kolmogorov's forward equations:
$$P'_{ij}(t) = \sum_{k \neq j} q_{kj} P_{ik}(t) - v_j P_{ij}(t).$$

Kolmogorov's backward equations:
$$P'_{ij}(t) = \sum_{k \neq i} q_{ik} P_{kj}(t) - v_i P_{ij}(t).$$

If the limits $P_j = \lim_{t \to \infty} P_{ij}(t)$ exist, the $P_j$ are given by
$$v_j P_j = \sum_{k \neq j} q_{kj} P_k \quad \text{and} \quad \sum_j P_j = 1.$$

In particular, for birth and death processes
$$P_0 = \frac{1}{\sum_{k=0}^{\infty} \theta_k} \quad \text{and} \quad P_k = \theta_k P_0 \text{ for } k = 1, 2, \ldots,$$
where $\theta_0 = 1$ and
$$\theta_k = \frac{\lambda_0 \lambda_1 \cdots \lambda_{k-1}}{\mu_1 \mu_2 \cdots \mu_k} \quad \text{for } k = 1, 2, \ldots$$
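For a birth and death process with only finitely many states (or one truncated at a finite level), the limiting probabilities above can be evaluated directly from the $\theta_k$. The following minimal Python sketch is an illustration only, with made-up rates, and is not part of the original formula sheet:

```python
# Minimal sketch: limiting probabilities of a birth-death process via theta_k,
# truncated at a finite number of states. The rates below are illustrative only.
import numpy as np

lam = [1.0, 1.0, 1.0, 1.0]   # birth rates lambda_0, ..., lambda_3 (placeholder values)
mu = [2.0, 2.0, 2.0, 2.0]    # death rates mu_1, ..., mu_4 (placeholder values)

theta = [1.0]                 # theta_0 = 1
for k in range(len(lam)):
    # theta_{k+1} = theta_k * lambda_k / mu_{k+1}
    theta.append(theta[-1] * lam[k] / mu[k])

theta = np.array(theta)
P = theta / theta.sum()       # P_0 = 1 / sum(theta_k), P_k = theta_k * P_0
print(P)
```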

TMA4265 Stochastic Processes, December 14th, 2015, Page iii of iii

Queueing theory

For the average number of customers in the system $L$ and in the queue $L_Q$; the average amount of time a customer spends in the system $W$ and in the queue $W_Q$; the service time $S$; the average remaining time (or work) in the system $V$; and the arrival rate $\lambda_a$, the following relations hold:
$$L = \lambda_a W,$$
$$L_Q = \lambda_a W_Q,$$
$$V = \lambda_a E[S W_Q] + \lambda_a E[S^2]/2.$$

Some mathematical series

$$\sum_{k=0}^{n} a^k = \frac{1 - a^{n+1}}{1 - a}, \qquad \sum_{k=0}^{\infty} k a^k = \frac{a}{(1-a)^2} \quad \text{(the latter for } |a| < 1\text{)}.$$
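The two series identities are easy to sanity-check numerically. The short Python sketch below (not part of the original formula sheet) compares truncated sums against the closed forms for an arbitrarily chosen $|a| < 1$:

```python
# Minimal sketch: numerical check of the two series identities for a chosen |a| < 1.
a, n = 0.3, 10   # illustrative values only

finite_sum = sum(a**k for k in range(n + 1))
print(finite_sum, (1 - a**(n + 1)) / (1 - a))        # finite geometric series

infinite_sum = sum(k * a**k for k in range(10_000))  # truncated approximation
print(infinite_sum, a / (1 - a)**2)                  # sum_{k>=0} k a^k
```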