LECTURE 19: Markov Processes I
Readings: Finish Section 5.2
Lecture outline:
- Checkout counter example
- Markov process: definition
- n-step transition probabilities
- Classification of states

Example: Checkout Counter
- Discrete time
- Customer arrivals: Bernoulli(p), so geometric interarrival times
- Customer service times: Geometric(q)
- State X_n: number of customers at time n
[Chain diagram: states 0, 1, 2, 3, ..., 9, 10]

Finite State Markov Models
- X_n: state after n transitions
- Belongs to a finite set, e.g. {1, ..., m}
- X_0 is either given or random
- Markov Property / Assumption: given the current state, the past does not matter
Modeling steps:
- Identify the possible states
- Mark the possible transitions
- Record the transition probabilities p_ij = P(X_{n+1} = j | X_n = i)

n-step Transition Probabilities
State occupancy probabilities, given initial state i:
  r_ij(n) = P(X_n = j | X_0 = i)
[Diagram: paths from state i at time 0, through intermediate states k, l, m at time n-1, to state j at time n]
Key recursion:
  r_ij(n) = Σ_k r_ik(n-1) p_kj
Random initial state:
  P(X_n = j) = Σ_i P(X_0 = i) r_ij(n)
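As an illustration, the key recursion r_ij(n) = Σ_k r_ik(n-1) p_kj can be iterated directly in code. This is a minimal sketch; the two-state transition matrix P is a made-up example, not from the lecture.

```python
# n-step transition probabilities via the key recursion
# r_ij(n) = sum_k r_ik(n-1) * p_kj, starting from r_ij(0) = 1 if i == j.
# P is a hypothetical two-state transition matrix.
P = [[0.5, 0.5],
     [0.2, 0.8]]

def n_step_probabilities(P, n):
    m = len(P)
    # r starts as the identity matrix: r_ij(0) = 1 if i == j else 0
    r = [[1.0 if i == j else 0.0 for j in range(m)] for i in range(m)]
    for _ in range(n):
        r = [[sum(r[i][k] * P[k][j] for k in range(m)) for j in range(m)]
             for i in range(m)]
    return r

r2 = n_step_probabilities(P, 2)  # r_ij(2) for all i, j
```

Note that each row of the result is a probability distribution over the next state, so it sums to 1.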

Example
[Diagram: two-state chain, states 1 and 2]

Generic Question
- Does r_ij(n) converge to something? [Diagram: three-state chain]
- Does the limit depend on the initial state? [Diagram: four-state chain]

Recurrent and Transient States
State i is recurrent if: starting from i, and from wherever you can go, there is a way of returning to i.
[Diagram: eight-state chain illustrating recurrent and transient states]
If not recurrent, a state is called transient.
If i is transient, then r_ji(n) → 0 as n → ∞; state i is visited only a finite number of times.
Recurrent class: collection of recurrent states that communicate with each other, and with no other state.

Periodic States
The states in a recurrent class are periodic if they can be grouped into d > 1 groups so that all transitions from one group lead to the next group.
[Diagram: eight-state periodic chain]
In this case, r_ij(n) cannot converge.

LECTURE 20: Markov Processes II
Readings: Section 6.3
Lecture outline:
- Markov process review
- Steady-state behavior
- Birth-death processes

Review
- Discrete state, discrete time, time-homogeneous
- Transition probabilities p_ij
- Markov property
- State occupancy probabilities, given initial state i: r_ij(n) = P(X_n = j | X_0 = i)
- Key recursion: r_ij(n) = Σ_k r_ik(n-1) p_kj

Recurrent and Transient States
State i is recurrent if: starting from i, and from wherever you can go, there is a way of returning to i.
If not recurrent, a state is called transient.
If i is transient, then r_ji(n) → 0 as n → ∞; state i is visited only a finite number of times.
Recurrent class: collection of recurrent states that communicate with each other, and with no other state.
[Diagram: nine-state chain illustrating recurrent and transient states]

Periodic States
The states in a recurrent class are periodic if they can be grouped into d > 1 groups so that all transitions from one group lead to the next group.
[Diagram: nine-state periodic chain]

Steady-State Probabilities
Do the r_ij(n) converge to some π_j (independent of the initial state i)?
Yes, if:
- recurrent states are all in a single class, and
- there is no periodicity.
Start from the key recursion:
  r_ij(n) = Σ_k r_ik(n-1) p_kj
Take the limit as n → ∞:
  π_j = Σ_k π_k p_kj, for all j
Additional equation:
  Σ_j π_j = 1
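A minimal sketch of finding the π_j numerically: iterate the key recursion from any initial distribution until it stops changing, which is valid when there is a single aperiodic recurrent class. The transition matrix P is hypothetical.

```python
# Steady-state probabilities by iterating r(n) = r(n-1) P until
# convergence (assumes a single aperiodic recurrent class).
# P is a hypothetical two-state transition matrix.
P = [[0.5, 0.5],
     [0.2, 0.8]]

def steady_state(P, tol=1e-12, max_iter=100000):
    m = len(P)
    r = [1.0 / m] * m  # any initial distribution works
    for _ in range(max_iter):
        nxt = [sum(r[k] * P[k][j] for k in range(m)) for j in range(m)]
        if max(abs(nxt[j] - r[j]) for j in range(m)) < tol:
            return nxt
        r = nxt
    return r

pi = steady_state(P)
# pi satisfies the balance equations pi_j = sum_k pi_k p_kj, sum_j pi_j = 1
```

For this particular P, the balance equations give π_1 = 2/7 and π_2 = 5/7, which the iteration reproduces.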

Example
[Diagram: two-state chain, states 1 and 2]

Example
[Diagram: two-state chain, states 1 and 2]
Assume the process starts at state 1.

Visit Frequency Interpretation
- (Long-run) frequency of being in j: π_j
- Frequency of transitions k → j: π_k p_kj
- Frequency of transitions into j: Σ_k π_k p_kj
[Diagram: transitions from states k, l, m into state j]

Random Walk (1)
A person walks between two walls, at positions 0 and m:
- to the right with probability p
- to the left with probability 1 - p
- pushes against the walls with the same probabilities (and stays in place).
[Chain diagram: states 0, 1, 2, ..., m]
Locally, between states i and i + 1:
Balance equations: π_i p = π_{i+1} (1 - p)

Random Walk (2)
Justification of the balance equations: the walk can only cross the boundary between i and i + 1 one step at a time, so in the long run the frequency of i → i + 1 transitions must equal the frequency of i + 1 → i transitions.

Random Walk (3)
Define: ρ = p / (1 - p)
Then: π_i = π_0 ρ^i, for i = 0, 1, ..., m
To get π_0, use: Σ_{i=0}^{m} π_i = 1, so π_0 = 1 / (1 + ρ + ··· + ρ^m)
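The closed form π_i = π_0 ρ^i with Σ π_i = 1 translates directly into code. This is a sketch; the parameter values in the usage line are arbitrary.

```python
# Steady state of the random walk on {0, ..., m}, from the local
# balance pi_{i+1} = rho * pi_i with rho = p / (1 - p).
def random_walk_steady_state(p, m):
    rho = p / (1.0 - p)
    weights = [rho ** i for i in range(m + 1)]
    total = sum(weights)  # normalization: 1 + rho + ... + rho**m
    return [w / total for w in weights]

pi = random_walk_steady_state(p=0.5, m=4)  # symmetric walk: uniform
```

For the symmetric walk (p = 1/2), ρ = 1 and all m + 1 states are equally likely.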

Birth-Death Process (1)
General (state-varying) case: birth probabilities p_i and death probabilities q_i.
[Chain diagram: states 0, 1, 2, 3, ..., m]
Locally, between states i and i + 1:
Balance equations: π_i p_i = π_{i+1} q_{i+1}
Why? (More powerful, e.g. queues, etc.)

Birth-Death Process (2)
Special case: p_i = p and q_i = q for all i; define ρ = p/q (called the load factor).
Less general than the state-varying case (but more general than the random walk).
Then: π_i = π_0 ρ^i
Assume p < q (so ρ < 1) and m ≈ ∞; then in steady state π_i = (1 - ρ) ρ^i

LECTURE 21: Markov Processes III
Readings: Section 6.4
Lecture outline:
- Review of steady-state behavior
- Queuing applications
- Calculating absorption probabilities
- Calculating expected time to absorption

Review
Assume a single class of recurrent states, aperiodic. Then,
  r_ij(n) → π_j, as n → ∞,
where π_j does not depend on the initial conditions. The π_j can be found as the unique solution of the balance equations:
  π_j = Σ_k π_k p_kj,
together with Σ_j π_j = 1.

Birth-Death Process
General case: birth probabilities p_i and death probabilities q_i.
[Chain diagram: states 0, 1, 2, 3, ..., N]
Locally, between states i and i + 1:
Balance equations: π_i p_i = π_{i+1} q_{i+1}
Why? (More powerful, e.g. queues, etc.)
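The local balance equations π_i p_i = π_{i+1} q_{i+1} give a one-pass recipe for the steady state of any birth-death chain. A minimal sketch, with made-up birth and death probabilities in the usage line:

```python
# Steady state of a general birth-death chain on {0, ..., N}, using
# the local balance pi_{i+1} = pi_i * p_i / q_{i+1}.
def birth_death_steady_state(p, q):
    # p[i] = P(i -> i+1) for i = 0..N-1; q[i] = P(i+1 -> i)
    pi = [1.0]  # unnormalized, with pi_0 set to 1
    for i in range(len(p)):
        pi.append(pi[-1] * p[i] / q[i])
    total = sum(pi)
    return [x / total for x in pi]

pi = birth_death_steady_state(p=[0.3, 0.3], q=[0.6, 0.6])
```

With constant p = 0.3 and q = 0.6 this reduces to the special case ρ = p/q = 1/2, so the unnormalized weights are 1, 1/2, 1/4.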

M/M/1 Queue (1)
- Poisson arrivals with rate λ
- Exponential service time with rate μ
- 1 server
- Maximum capacity of the system = N
Discrete time intervals of (small) length δ: P(arrival) ≈ λδ, P(departure) ≈ μδ.
[Chain diagram: states 0, 1, ..., i-1, i, ..., N-1, N]
Balance equations: π_i λδ = π_{i+1} μδ, i.e. π_i λ = π_{i+1} μ
Identical solution to the random walk problem.

M/M/1 Queue (2)
Define: ρ = λ/μ
Then: π_i = π_0 ρ^i
To get π_0, use: Σ_{i=0}^{N} π_i = 1
Consider 2 cases (for the limit N → ∞): ρ < 1 and ρ ≥ 1!
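In the case ρ < 1 with N → ∞, the steady state is geometric, π_i = (1 - ρ) ρ^i, and the expected number in the system is ρ / (1 - ρ). A sketch under those assumptions; the `imax` cutoff is only for reporting finitely many probabilities:

```python
# M/M/1 in steady state, assuming rho = lam/mu < 1 and unbounded
# capacity: pi_i = (1 - rho) * rho**i, E[number in system] = rho/(1-rho).
def mm1_queue(lam, mu, imax):
    rho = lam / mu
    assert rho < 1, "steady state exists only for rho < 1"
    pi = [(1 - rho) * rho ** i for i in range(imax + 1)]
    expected_number = rho / (1 - rho)
    return pi, expected_number

pi, en = mm1_queue(lam=1.0, mu=2.0, imax=10)
```

With λ = 1 and μ = 2, half the time the queue is empty and on average one customer is in the system.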

The Phone Company Problem (1)
- Poisson arrivals (calls) with rate λ
- Exponential service time (call duration), rate μ
- N servers (number of lines)
- Maximum capacity of the system = N
Discrete time intervals of (small) length δ:
[Chain diagram: states 0, 1, ..., i-1, i, ..., N-1, N]
Balance equations: π_{i-1} λ = π_i i μ
Solve to get: π_i = π_0 (λ/μ)^i / i!

The Phone Company Problem (2)
[Chain diagram: states 0, 1, ..., i-1, i, ..., N-1, N]
Balance equations: π_{i-1} λ = π_i i μ
Solution: π_i = ((λ/μ)^i / i!) / (Σ_{j=0}^{N} (λ/μ)^j / j!)
Consider the limiting behavior as N → ∞. Therefore:
  π_i → e^{-λ/μ} (λ/μ)^i / i!   (Poisson)
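The solution above normalizes the weights (λ/μ)^i / i! over the N + 1 states; π_N is then the probability that all lines are busy (a newly arriving call is blocked). A sketch with arbitrary illustrative parameters:

```python
import math

# Steady state of the phone-company chain: pi_i proportional
# to (lam/mu)**i / i!, normalized over states 0..N.
def phone_line_probabilities(lam, mu, N):
    a = lam / mu
    weights = [a ** i / math.factorial(i) for i in range(N + 1)]
    total = sum(weights)
    return [w / total for w in weights]

pi = phone_line_probabilities(lam=1.0, mu=1.0, N=2)
blocking = pi[-1]  # probability that all N lines are busy
```

With λ/μ = 1 and N = 2, the weights are 1, 1, 1/2, so the blocking probability is 0.2.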

M/M/m Queue
- Poisson arrivals with rate λ
- Exponential service time with rate μ
- m servers
- Maximum capacity of the system = N
Discrete time intervals of (small) length δ:
[Chain diagram: states 0, 1, ..., i-1, i, ..., m, ..., j-1, j, ..., N-1, N]
Balance equations:
  π_{i-1} λ = π_i i μ, for i ≤ m
  π_{j-1} λ = π_j m μ, for j > m
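The two balance regimes (service rate grows with the number of busy servers up to m, then stays at mμ) can be combined into a single pass with min(i, m). A sketch; the usage parameters are made up, and with m = 1 this reduces to the finite-capacity M/M/1:

```python
# Steady state of the M/M/m chain on {0, ..., N}, from the local
# balance pi_i * lam = pi_{i+1} * min(i+1, m) * mu.
def mmm_steady_state(lam, mu, m, N):
    w = [1.0]  # unnormalized weights, w[0] = 1
    for i in range(1, N + 1):
        w.append(w[-1] * lam / (min(i, m) * mu))
    total = sum(w)
    return [x / total for x in w]

pi = mmm_steady_state(lam=1.0, mu=2.0, m=1, N=2)
```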

Gambler's Ruin (1)
Each round, Charles Barkley wins 1 thousand dollars with probability p and loses 1 thousand dollars with probability 1 - p.
Casino capital is equal to … He claims he does not have a gambling problem!
[Chain diagram: states 0, 1, 2, ..., m]
Both states 0 and m are absorbing!

Calculating Absorption Probabilities
Each state is either transient or absorbing.
Let s be one absorbing state.
Definition: let a_i be the probability that the state will eventually end up in s, given that the chain starts in state i.
- For i = s: a_i = 1
- For every other absorbing state i: a_i = 0
- For all other i: a_i = Σ_j p_ij a_j
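The equations a_i = Σ_j p_ij a_j are linear and can be solved by simple fixed-point iteration. A sketch on a hypothetical four-state chain (a gambler's-ruin-style chain with win probability 0.6, states 0 and 3 absorbing):

```python
# Absorption probabilities via fixed-point iteration of
# a_i = sum_j p_ij a_j, with a fixed at 1 on the target absorbing
# state and at 0 on every other absorbing state.
# Hypothetical chain: states 0 and 3 absorbing, win probability 0.6.
P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.4, 0.0, 0.6, 0.0],
    [0.0, 0.4, 0.0, 0.6],
    [0.0, 0.0, 0.0, 1.0],
]

def absorption_probs(P, target, iters=10000):
    m = len(P)
    absorbing = {i for i in range(m) if P[i][i] == 1.0}
    a = [0.0] * m
    a[target] = 1.0
    for _ in range(iters):
        for i in range(m):
            if i not in absorbing:
                a[i] = sum(P[i][j] * a[j] for j in range(m))
    return a

a = absorption_probs(P, target=3)  # probability of ending in state 3
```

For this chain the exact answers are a_1 = 9/19 and a_2 = 15/19, which the iteration converges to.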

Gambler's Ruin (2)
[Chain diagram: states 0, 1, 2, ..., m]
Apply the absorption equations to this chain: a_i = p a_{i+1} + (1 - p) a_{i-1}, with a_0 = 0 and a_m = 1.

Expected Time to Absorption
[Diagram: four-state chain with an absorbing state]
What is the expected number of transitions μ_i until the process reaches the absorbing state, given that the initial state is i?
- For the absorbing state s: μ_s = 0
- For all other i: μ_i = 1 + Σ_j p_ij μ_j
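These equations also solve by fixed-point iteration: every non-absorbing state pays one transition plus the expected time from wherever it lands. A sketch on a hypothetical three-state chain with two absorbing states:

```python
# Expected time to absorption via mu_i = 1 + sum_j p_ij mu_j,
# with mu fixed at 0 on absorbing states.
# Hypothetical chain: states 0 and 2 absorbing.
P = [
    [1.0, 0.0, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.0, 1.0],
]

def expected_absorption_time(P, iters=10000):
    m = len(P)
    absorbing = {i for i in range(m) if P[i][i] == 1.0}
    mu = [0.0] * m
    for _ in range(iters):
        for i in range(m):
            if i not in absorbing:
                mu[i] = 1.0 + sum(P[i][j] * mu[j] for j in range(m))
    return mu

mu = expected_absorption_time(P)
```

Here state 1 satisfies μ_1 = 1 + 0.4 μ_1, so the exact answer is μ_1 = 5/3.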