2905 Queueing Theory and Simulation PART IV: SIMULATION
22 Random Numbers

A fundamental step in a simulation study is the generation of random numbers, where a random number represents the value of a random variable uniformly distributed on (0, 1).

22.1 Pseudo-random Number Generation

Subroutines for generating random numbers are called random-number generators and are generally included in scientific packages or simulation programming languages. Most of these random-number generators use algorithms that produce a deterministic string of numbers that have all the appearances of being independent uniform (0, 1) random variables.

One of the most common approaches to generating pseudorandom numbers starts with an initial value $x_0$, called the seed, and then recursively computes successive values $x_n$, $n \ge 1$, by letting

$$x_n = (a x_{n-1} + c) \bmod m \quad (\text{e.g. } a = 7^5,\ m = 2^{31} - 1) \qquad (1)$$

where $a$, $c$ and $m$ are given positive integers. Each $x_n$ is one of $0, 1, \ldots, m - 1$, and the quantity $x_n/m$, called a pseudorandom number, is taken as an approximation to a uniform (0, 1) random variable.
The approach specified by equation (1) is called the mixed congruential method if $c \ne 0$. If $c = 0$, it is called the multiplicative congruential method.

Example: Use the mixed congruential method to generate a sequence of pseudorandom numbers $R_1, R_2, \ldots$ with $x_0 = 27$, $a = 17$, $c = 43$, and $m = 100$.

Solution:

$x_0 = 27$
$x_1 = [(17)(27) + 43] \bmod 100 = 502 \bmod 100 = 2$, so $R_1 = 2/100 = 0.02$
$x_2 = [(17)(2) + 43] \bmod 100 = 77 \bmod 100 = 77$, so $R_2 = 77/100 = 0.77$

We shall not pursue further the question of how to construct good random-number generators. Rather, in conducting computer simulation of systems, we assume that we have a black box that gives a random number on request.
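The recursion in equation (1) and the worked example above can be sketched in a few lines of Python (an illustration, not part of the original notes):

```python
def mixed_congruential(x0, a, c, m, n):
    """Generate n pseudorandom numbers x_k/m with x_k = (a*x_{k-1} + c) mod m."""
    numbers = []
    x = x0
    for _ in range(n):
        x = (a * x + c) % m
        numbers.append(x / m)
    return numbers
```

With the seed and constants of the example ($x_0 = 27$, $a = 17$, $c = 43$, $m = 100$), the first two values produced are 0.02 and 0.77, as computed above.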
22.2 Using Random Numbers to Evaluate Definite Integrals

One of the earliest applications of random numbers was the computation of integrals. Suppose we want to evaluate the integral

$$I = \int_0^1 f(x)\,dx.$$

If $U$ is uniformly distributed over (0, 1), then $I = E[f(U)]$. To estimate $E[f(U)]$, suppose $U_1, U_2, \ldots, U_n$ are independent random variables uniformly distributed on (0, 1); then $f(U_1), f(U_2), \ldots, f(U_n)$
are independent and identically distributed random variables having mean $I$. Therefore, by the Strong Law of Large Numbers, with probability one,

$$\lim_{n \to \infty} \sum_{i=1}^n \frac{f(U_i)}{n} = I.$$

This means that when $n$ is large,

$$I_n = \sum_{i=1}^n \frac{f(U_i)}{n}$$

will be close to $I$.

Example: Find an approximation of the definite integral $I = \int_0^1 \sin(x)\,dx$ by the simulation method just discussed. The true value of $I$ is $1 - \cos(1) \approx 0.4597$.

Table 22.1: The approximation $I_n$ of $I$ for different numbers of simulations $n$.
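The estimator $I_n$ can be sketched as follows (an illustrative Python version, not part of the original notes, with $\sin$ as the integrand of the example):

```python
import math
import random

def mc_integral(f, n, rng=random):
    """Estimate the integral of f over (0,1) by averaging f at n uniform points."""
    return sum(f(rng.random()) for _ in range(n)) / n
```

With $n = 100000$, the estimate of $\int_0^1 \sin(x)\,dx$ is typically within a few thousandths of $1 - \cos(1) \approx 0.4597$.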
Let us extend the idea of simulation to evaluate

$$I = \int_a^b f(x)\,dx.$$

To apply the result, we perform the substitution $y = (x - a)/(b - a)$, so that $dy = dx/(b - a)$ and

$$I = \int_0^1 f(a + [b - a]y)(b - a)\,dy.$$

How about the case

$$I = \int_0^\infty f(x)\,dx?$$

We can apply the substitution $y = (1 + x)^{-1}$. Then

$$dy = -dx/(x + 1)^2 = -y^2\,dx,$$

and, with the limits of integration reversed,

$$I = \int_0^1 \frac{f(1/y - 1)}{y^2}\,dy.$$
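The finite-interval substitution translates directly into code (a sketch under the same setup, not from the notes):

```python
import random

def mc_integral_ab(f, a, b, n, rng=random):
    """Estimate the integral of f over (a, b) via the substitution y = (x-a)/(b-a)."""
    total = 0.0
    for _ in range(n):
        y = rng.random()
        total += f(a + (b - a) * y) * (b - a)
    return total / n
```

For instance, estimating $\int_0^2 x\,dx = 2$ with $n = 100000$ samples gives a value close to 2.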
23 Generating Discrete Random Variables (The Inverse Transform Method)

Suppose we want to generate the value of a discrete random variable $X$ having probability mass function

$$P\{X = x_j\} = p_j, \quad j = 0, 1, \ldots, \quad \sum_j p_j = 1.$$

We generate a random number $U$ and set

$$X = \begin{cases} x_0 & \text{if } U < p_0 \\ x_1 & \text{if } p_0 \le U < p_0 + p_1 \\ \vdots \\ x_j & \text{if } \sum_{i=0}^{j-1} p_i \le U < \sum_{i=0}^{j} p_i \\ \vdots \end{cases}$$

Since, for $0 < a < b < 1$, $P\{a \le U < b\} = b - a$, we have

$$P\{X = x_j\} = P\left\{\sum_{i=0}^{j-1} p_i \le U < \sum_{i=0}^{j} p_i\right\} = p_j,$$

and so $X$ has the desired distribution.
This method can be written algorithmically as:

Generate a random number $U$.
If $U < p_0$ set $X = x_0$ and stop.
If $U < p_0 + p_1$ set $X = x_1$ and stop.
If $U < p_0 + p_1 + p_2$ set $X = x_2$ and stop.
...

If the $x_i$, $i \ge 0$, are ordered so that $x_0 < x_1 < \cdots$ and if we let $F$ denote the distribution function of $X$, then

$$F(x_j) = \sum_{i=0}^{j} p_i$$

and so

$$X = x_j \quad \text{if } F(x_{j-1}) \le U < F(x_j).$$
Figure: the cumulative sums $p_0$, $p_0 + p_1, \ldots$ partition $(0, 1)$ into intervals corresponding to $x_0, x_1, x_2, x_3, \ldots$

In other words, after generating a random number $U$ we determine the value of $X$ by finding the interval $(F(x_{j-1}), F(x_j))$ in which $U$ lies [or, equivalently, by finding $F^{-1}(U)$]. Thus this method is called the discrete inverse transform method for generating $X$. In general, after generating $U$, one has to go through a table-lookup procedure in order to find $X$. However, in some cases one can determine $X$ by an algebraic formula.
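The table-lookup procedure is a linear scan through the cumulative sums; a minimal Python sketch (not part of the original notes):

```python
import random

def discrete_inverse_transform(values, probs, rng=random):
    """Draw X with P{X = values[j]} = probs[j] by locating U among cumulative sums."""
    u = rng.random()
    cumulative = 0.0
    for x, p in zip(values, probs):
        cumulative += p
        if u < cumulative:
            return x
    return values[-1]  # guard against floating-point round-off in the sums
```

The scan stops at the first index whose cumulative probability exceeds $U$, which is exactly the interval condition $F(x_{j-1}) \le U < F(x_j)$.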
Example: Suppose we want to generate the value of $X$ such that

$$P\{X = j\} = \frac{1}{n}, \quad j = 1, 2, \ldots, n.$$

Using the inverse transform method,

$$X = j \quad \text{if } \frac{j-1}{n} \le U < \frac{j}{n}, \quad j = 1, 2, \ldots, n,$$

or

$$X = \lfloor nU \rfloor + 1,$$

where $\lfloor x \rfloor$ (called the floor of $x$) is the greatest integer less than or equal to $x$. In this example, we have an algebraic formula for determining $X$ in terms of $U$.
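The algebraic formula avoids the table scan entirely (illustrative sketch):

```python
import math
import random

def discrete_uniform(n, rng=random):
    """Draw X uniformly from {1, ..., n} via X = floor(n*U) + 1."""
    return math.floor(n * rng.random()) + 1
```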
Example (Calculating Averages): Suppose we want to approximate

$$\bar{a} = \sum_{i=1}^n \frac{a(i)}{n}$$

where $n$ is large and the values $a(i)$, $i = 1, 2, \ldots, n$, are complicated and not easily calculated. Let $X$ be a discrete uniform random variable such that

$$P\{X = j\} = \frac{1}{n}, \quad j = 1, 2, \ldots, n.$$

Then

$$E[a(X)] = \sum_{i=1}^n a(i) P\{X = i\} = \sum_{i=1}^n \frac{a(i)}{n} = \bar{a}.$$

Hence if we generate $k$ random numbers $U_i$ and set $X_i = \lfloor nU_i \rfloor + 1$, then each of the $k$ random variables $a(X_i)$ will have mean $\bar{a}$, and so by the strong law of large numbers it follows that when $k$ is large,

$$\sum_{i=1}^k \frac{a(X_i)}{k} \approx \bar{a} \quad \text{with probability 1.}$$
Example: Consider the geometric random variable $X$ where

$$P\{X = i\} = pq^{i-1}, \quad i \ge 1, \quad q = 1 - p.$$

Since

$$\sum_{i=1}^{j-1} P\{X = i\} = 1 - q^{j-1}, \quad j \ge 1,$$

we can generate the value of $X$ by generating a random number $U$ and setting $X$ equal to that value $j$ for which

$$1 - q^{j-1} \le U < 1 - q^j$$

or, equivalently, for which

$$X = \operatorname{Min}\{j : q^j < 1 - U\} = \operatorname{Min}\{j : j \log q < \log(1 - U)\} = \operatorname{Min}\left\{j : j > \frac{\log(1 - U)}{\log q}\right\}.$$

We note that $\log q$ and $\log(1 - U)$ are both negative. Hence

$$X = \left\lfloor \frac{\log(1 - U)}{\log q} \right\rfloor + 1. \qquad (2)$$

Since $1 - U$ is also uniformly distributed on (0, 1), Eq. (2) is equivalent to

$$X = \left\lfloor \frac{\log U}{\log q} \right\rfloor + 1.$$

Thus we again have an algebraic formula for determining $X$.
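The closed-form expression for the geometric variable is one line of code (an illustrative sketch):

```python
import math
import random

def geometric(p, rng=random):
    """Draw a geometric r.v., P{X = i} = p*q**(i-1), via X = floor(log U / log q) + 1."""
    q = 1.0 - p
    return math.floor(math.log(rng.random()) / math.log(q)) + 1
```

The sample mean of many draws should be close to $1/p$.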
Example (Generating a Poisson random variable): Let $X$ be a Poisson random variable with probability mass function

$$p_i = P\{X = i\} = e^{-\lambda}\frac{\lambda^i}{i!}, \quad i = 0, 1, \ldots.$$

By noting that

$$p_{i+1} = \frac{\lambda}{i+1} p_i, \quad i \ge 0,$$

we can write the inverse transform method for generating $X$ as follows (where $F = F(i) = P\{X \le i\}$):

Step 1: Generate a random number $U$.
Step 2: $i = 0$, $p = e^{-\lambda}$, $F = p$.
Step 3: If $U < F$, set $X = i$ and stop.
Step 4: $p = \lambda p/(i + 1)$, $F = F + p$, $i = i + 1$.
Step 5: Go to Step 3.

The algorithm successively checks whether the Poisson value is 0, then 1, then 2, and so on. Thus the number of comparisons needed will be 1 greater than the generated value of the Poisson. Hence, on average, the above algorithm needs to make $1 + \lambda$ searches.
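The five steps above can be sketched directly in Python (not part of the original notes):

```python
import math
import random

def poisson_inverse(lam, rng=random):
    """Inverse transform for Poisson(lam), using p_{i+1} = lam/(i+1) * p_i."""
    u = rng.random()
    i = 0
    p = math.exp(-lam)  # p = P{X = i}
    F = p               # F = P{X <= i}
    while u >= F:
        p = lam * p / (i + 1)
        F += p
        i += 1
    return i
```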
Example (Generating binomial random variables): Suppose $X$ is a binomial random variable with probability mass function

$$P\{X = i\} = \frac{n!}{i!(n-i)!} p^i (1-p)^{n-i}, \quad i = 0, 1, \ldots, n.$$

Again we apply the inverse transform method, making use of the recursive relation

$$P\{X = i + 1\} = \frac{n - i}{i + 1} \cdot \frac{p}{1 - p} \cdot P\{X = i\}.$$

Step 1: Generate a random number $U$.
Step 2: $c = p/(1 - p)$, $i = 0$, $r = (1 - p)^n$, $F = r$.
Step 3: If $U < F$, set $X = i$ and stop.
Step 4: $r = [c(n - i)/(i + 1)]r$, $F = F + r$, $i = i + 1$.
Step 5: Go to Step 3.

In the algorithm, with $i$ denoting the value currently under consideration, $r = P\{X = i\}$ and $F = F(i) = P\{X \le i\}$. The number of searches required is the value of $X$ plus 1. Since $E[X] = np$, on average it will take $1 + np$ searches to generate $X$.
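A Python sketch of the binomial algorithm (illustrative; the `i < n` guard is an added safeguard against floating-point round-off, not part of the stated steps):

```python
import random

def binomial_inverse(n, p, rng=random):
    """Inverse transform for Binomial(n, p) using the recursion on P{X = i}."""
    c = p / (1.0 - p)
    u = rng.random()
    i = 0
    r = (1.0 - p) ** n  # r = P{X = i}
    F = r               # F = P{X <= i}
    while u >= F and i < n:  # i < n guards against round-off when u is near 1
        r = c * (n - i) / (i + 1) * r
        F += r
        i += 1
    return i
```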
24 Generating Discrete Random Variables (The Acceptance-Rejection Technique)

Suppose we have an efficient method for simulating a random variable $Y$ having probability mass function $\{q_j, j \ge 0\}$. We can use this as the basis for simulating another random variable $X$ with probability mass function $\{p_j, j \ge 0\}$. Let $c$ be a constant such that

$$\frac{p_j}{q_j} \le c \quad \text{for all } j \text{ such that } p_j > 0.$$

Rejection method (acceptance-rejection method):

Step 1: Simulate the value of $Y$, having probability mass function $q_j$.
Step 2: Generate a random number $U$.
Step 3: If $U < p_Y/(cq_Y)$, set $X = Y$ and stop. Otherwise, return to Step 1.

Proposition 1: The acceptance-rejection method generates a random variable $X$ such that $P\{X = j\} = p_j$, $j = 0, 1, \ldots$. In addition, the number of iterations of the algorithm needed to obtain $X$ is a geometric random variable with mean $c$.
Proof: Let us first determine the probability that a single iteration produces the accepted value $j$:

$$P\{Y = j, \text{it is accepted}\} = P\{Y = j\}P\{\text{accepted} \mid Y = j\} = q_j \cdot \frac{p_j}{cq_j} = \frac{p_j}{c}.$$

Summing over $j$,

$$P\{\text{accepted}\} = \sum_j \frac{p_j}{c} = \frac{1}{c}.$$

As each iteration independently results in an accepted value with probability $1/c$, we see that the number of iterations needed is geometric with mean $c$. Also,

$$P\{X = j\} = \sum_n P\{j \text{ accepted on iteration } n\} = \sum_n \left(1 - \frac{1}{c}\right)^{n-1}\frac{p_j}{c} = p_j.$$

Remarks: (i) Since $\sum_j p_j \le c \sum_j q_j$ and $\sum_j p_j = \sum_j q_j = 1$, we have $c \ge 1$. (ii) The number of iterations is smaller if $c$ is smaller.
Example: Suppose we want to generate a random variable $X$ that takes one of the values $1, 2, \ldots, 10$ with respective probabilities 0.11, 0.12, 0.09, 0.08, 0.10, 0.08, 0.10, 0.09, 0.11, 0.12. One way is to use the inverse transform method. However, here let us use the rejection method with $Y$ being the random variable with the uniform discrete density

$$q_j = P\{Y = j\} = \frac{1}{10}, \quad j = 1, 2, \ldots, 10.$$

Let $p_j = P\{X = j\}$. Then

$$c = \max\left\{\frac{p_j}{q_j}\right\} = \frac{0.12}{0.10} = 1.2$$

and the algorithm is given as follows:

Step 1: Generate a random number $U_1$ and set $Y = \lfloor 10U_1 \rfloor + 1$.
Step 2: Generate a second random number $U_2$.
Step 3: If $U_2 \le p_Y/0.12$, set $X = Y$ and stop. Otherwise return to Step 1.
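A sketch of the discrete rejection method with a uniform proposal (illustrative Python; with $q_j = 1/n$ and $c = n \max_j p_j$, the acceptance probability $p_Y/(cq_Y)$ simplifies to $p_Y/\max_j p_j$). The probability list below uses 0.12 as the tenth value so that the example's probabilities sum to 1:

```python
import math
import random

def rejection_discrete(p, rng=random):
    """Acceptance-rejection for P{X = j} = p[j-1] on {1, ..., len(p)},
    proposing Y uniform on {1, ..., len(p)}."""
    n = len(p)
    pmax = max(p)  # with q_j = 1/n and c = n*pmax, accept prob is p_Y/pmax
    while True:
        y = math.floor(n * rng.random()) + 1  # proposal Y
        if rng.random() < p[y - 1] / pmax:    # accept with prob p_Y/(c*q_Y)
            return y
```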
24.1 The Composition Approach

Suppose we have efficient methods for generating random variables $X_1$ and $X_2$ with probability mass functions $\{p_j^{(1)}, j \ge 0\}$ and $\{p_j^{(2)}, j \ge 0\}$ respectively, and we want to generate a random variable $X$ having mass function

$$P\{X = j\} = \alpha p_j^{(1)} + (1 - \alpha)p_j^{(2)}, \quad j \ge 0,$$

where $0 < \alpha < 1$. The composition algorithm is as follows:

Step 1: Generate $X_1$ with mass function $\{p_j^{(1)}, j \ge 0\}$.
Step 2: Generate $X_2$ with mass function $\{p_j^{(2)}, j \ge 0\}$.
Step 3: Generate a random number $U$.
Step 4: If $U < \alpha$, set $X = X_1$. Otherwise, set $X = X_2$.

We can prove that the algorithm gives the correct value by observing

$$P\{X = j\} = P\{U < \alpha \text{ and } X_1 = j\} + P\{U \ge \alpha \text{ and } X_2 = j\} = P\{U < \alpha\}P\{X_1 = j\} + P\{U \ge \alpha\}P\{X_2 = j\} = \alpha p_j^{(1)} + (1 - \alpha)p_j^{(2)}.$$
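The composition step is a single coin flip between two component generators (illustrative sketch; `gen1` and `gen2` are any zero-argument sampling functions):

```python
import random

def composition(gen1, gen2, alpha, rng=random):
    """Draw from the mixture alpha*dist1 + (1-alpha)*dist2 given generators
    gen1, gen2 for the two component distributions."""
    return gen1() if rng.random() < alpha else gen2()
```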
25 Generating Continuous Random Variables (The Inverse Transform Algorithm)

The inverse transform algorithm for generating continuous random variables is based on the following proposition:

Proposition 2: Let $U$ be a uniform (0, 1) random variable. For any continuous distribution function $F$, the random variable $X$ defined by

$$X = F^{-1}(U)$$

has distribution $F$.

Proof: Let $F_X$ denote the distribution function of $X = F^{-1}(U)$. Then

$$F_X(x) = P\{X \le x\} = P\{F^{-1}(U) \le x\}.$$

Since $F$ is a monotonically increasing function, $F^{-1}(U) \le x$ is equivalent to $F(F^{-1}(U)) \le F(x)$, i.e. $U \le F(x)$. Hence

$$F_X(x) = P\{F(F^{-1}(U)) \le F(x)\} = P\{U \le F(x)\} = F(x).$$
Example: Let $X$ be an exponential random variable with mean $1/\lambda$; its cumulative distribution function is

$$F(x) = \int_0^x \lambda e^{-\lambda t}\,dt = 1 - e^{-\lambda x}.$$

If we let $x = F^{-1}(u)$, then

$$u = F(x) = 1 - e^{-\lambda x} \quad \text{or} \quad x = -\frac{1}{\lambda}\log(1 - u).$$

Hence we can generate the exponential random variable by generating a random number $U$ and then setting

$$X = F^{-1}(U) = -\frac{1}{\lambda}\log(1 - U).$$

Since $U$ and $1 - U$ have the same distribution, we can equivalently write

$$X = -\frac{1}{\lambda}\log U.$$
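The exponential formula above, sketched in Python (not part of the original notes):

```python
import math
import random

def exponential(lam, rng=random):
    """Exponential with mean 1/lam via inverse transform: X = -log(U)/lam."""
    return -math.log(rng.random()) / lam
```

The sample mean of many draws should be close to $1/\lambda$.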
Example (Another algorithm for generating a Poisson random variable): Recall that a Poisson process with rate $\lambda$ results when the times between successive events are independent exponential random variables with mean $1/\lambda$. For such a process, $N(1)$, the number of events by time 1, is Poisson distributed with mean $\lambda$. If we let $X_i$, $i = 1, 2, \ldots$, denote the successive inter-arrival times, then the $n$th event will occur at time $\sum_{i=1}^n X_i$, and so

$$N(1) = \operatorname{Max}\left\{n : \sum_{i=1}^n X_i \le 1\right\}.$$

Using the result above,

$$N(1) = \operatorname{Max}\left\{n : \sum_{i=1}^n -\frac{1}{\lambda}\log U_i \le 1\right\} = \operatorname{Max}\left\{n : \sum_{i=1}^n \log U_i \ge -\lambda\right\} = \operatorname{Max}\left\{n : U_1 \cdots U_n \ge e^{-\lambda}\right\},$$

where $U_1, U_2, \ldots$ are random numbers.
Based on this result, we have the following algorithm for generating a Poisson random variable $N$ with rate $\lambda$:

Step 1: $N = 0$, $P = 1$.
Step 2: Generate a random number $U$.
Step 3: Set $P = PU$.
Step 4: If $P < e^{-\lambda}$, stop. Otherwise, set $N = N + 1$ and go to Step 2.
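The four steps can be sketched as (illustrative Python):

```python
import math
import random

def poisson_product(lam, rng=random):
    """Poisson(lam): multiply uniforms until the product drops below exp(-lam)."""
    n = 0
    product = 1.0
    threshold = math.exp(-lam)
    while True:
        product *= rng.random()
        if product < threshold:
            return n
        n += 1
```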
Example: Suppose we want to generate an $n$-phase Erlangian random variable $X$, whose distribution function is given by

$$F(x) = 1 - \sum_{i=0}^{n-1}\frac{(\lambda x)^i e^{-\lambda x}}{i!}.$$

It is not possible to give a closed-form expression for its inverse. However, $X$ is the sum of $n$ independent exponential random variables, each with mean $1/\lambda$, so we can use the result of the previous example to generate $X$. Specifically, we generate $n$ random numbers $U_1, U_2, \ldots, U_n$ and then set

$$X = -\frac{1}{\lambda}\log U_1 - \cdots - \frac{1}{\lambda}\log U_n = -\frac{1}{\lambda}\log(U_1 \cdots U_n).$$
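The Erlang construction as a sum of exponentials, sketched in Python (not from the notes):

```python
import math
import random

def erlang(n, lam, rng=random):
    """n-phase Erlang with rate lam: X = -log(U_1 * ... * U_n) / lam."""
    product = 1.0
    for _ in range(n):
        product *= rng.random()
    return -math.log(product) / lam
```

The sample mean of many draws should be close to $n/\lambda$.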
26 Generating Continuous Random Variables (The Rejection Method)

Suppose we have a method for generating a random variable $Y$ having density function $g(x)$. We can use this as the basis for generating a random variable $X$ having density function $f(x)$. Let $c$ be a constant such that

$$\frac{f(y)}{g(y)} \le c \quad \text{for all } y.$$

The Rejection Method:

Step 1: Generate $Y$ having density $g$.
Step 2: Generate a random number $U$.
Step 3: If $U \le f(Y)/(cg(Y))$, set $X = Y$. Otherwise, return to Step 1.

The method is similar to the rejection method for discrete random variables, except that the density function replaces the mass function. As in the discrete case, we can prove the following result.

Proposition 3: The random variable generated by the rejection method has density $f$, and the number of iterations of the algorithm that are needed is a geometric random variable with mean $c$.
Example: Let us use the rejection method to generate a random variable having density function

$$f(x) = 20x(1 - x)^3, \quad 0 < x < 1.$$

Consider $g(x) = 1$, $0 < x < 1$. To determine $c$ such that $f(x)/g(x) \le c$, we use calculus to determine the maximum value of

$$\frac{f(x)}{g(x)} = 20x(1 - x)^3:$$

$$\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right) = 20[(1 - x)^3 - 3x(1 - x)^2].$$

The derivative is zero when $x = 1/4$, positive when $x < 1/4$ and negative when $x > 1/4$. Hence $\max_{0 \le x \le 1} f(x)/g(x)$ occurs at $x = 1/4$ and thus

$$\frac{f(x)}{g(x)} \le \frac{f(1/4)}{g(1/4)} = 20 \cdot \frac{1}{4}\left(\frac{3}{4}\right)^3 = \frac{135}{64} = c.$$
Hence

$$\frac{f(x)}{cg(x)} = \frac{256}{27}x(1 - x)^3,$$

and the rejection method is as follows:

Step 1: Generate random numbers $U_1$ and $U_2$.
Step 2: If $U_2 \le \frac{256}{27}U_1(1 - U_1)^3$, stop and set $X = U_1$. Otherwise, return to Step 1.

The average number of iterations that will be performed is $c = 135/64 \approx 2.11$.
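The two-step rejection loop for this density, sketched in Python (illustrative; $f(x) = 20x(1-x)^3$ is the Beta(2, 4) density, whose mean is $1/3$):

```python
import random

def rejection_density(rng=random):
    """Rejection sampling for f(x) = 20x(1-x)^3 on (0,1) with a uniform
    proposal; accept when U2 <= f(Y)/(c*g(Y)) = (256/27) * Y * (1-Y)**3."""
    while True:
        y = rng.random()
        if rng.random() <= 256.0 / 27.0 * y * (1.0 - y) ** 3:
            return y
```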
Example: Suppose we want to generate a random variable having the density

$$f(x) = Kx^{1/2}e^{-x}, \quad x > 0,$$

where $K = 2/\sqrt{\pi}$. Let us try the rejection method with an exponential random variable with mean $1/\lambda$, i.e.

$$g(x) = \lambda e^{-\lambda x}, \quad x > 0.$$

Now

$$\frac{f(x)}{g(x)} = \frac{K}{\lambda}x^{1/2}e^{-(1-\lambda)x}$$

and

$$\frac{d}{dx}\left(\frac{f(x)}{g(x)}\right) = \frac{K}{\lambda}\left[\frac{1}{2}x^{-1/2}e^{-(1-\lambda)x} - (1 - \lambda)x^{1/2}e^{-(1-\lambda)x}\right],$$

which is zero when $\frac{1}{2}x^{-1/2} = (1 - \lambda)x^{1/2}$, i.e. when $x = \frac{1}{2(1 - \lambda)}$, provided that $\lambda < 1$. Moreover, the second derivative of $f/g$ at $x = \frac{1}{2(1 - \lambda)}$ is negative, so this critical point is a maximum.
Hence

$$c = \operatorname{Max}\frac{f(x)}{g(x)} = \frac{K}{\lambda}\frac{1}{[2(1 - \lambda)]^{1/2}}e^{-1/2}.$$

Algorithm:

Step 1: Generate a random number $U_1$ and set $Y = -\frac{1}{\lambda}\log U_1$.
Step 2: Generate a random number $U_2$.
Step 3: If $U_2 < f(Y)/(cg(Y))$, set $X = Y$. Otherwise, return to Step 1.

The average number of iterations that will be needed is $c$. Since $c$ depends on $\lambda$, we can try to minimize $c$ by choosing a suitable $\lambda$. Differentiating $c$ with respect to $\lambda$ and setting $c'(\lambda) = 0$ gives $1 - \lambda = \lambda/2$, i.e. $\lambda = 2/3$. Hence

$$c = \frac{3K}{2}\left(\frac{3}{2e}\right)^{1/2} = \frac{3^{3/2}}{(2\pi e)^{1/2}}.$$

Since $c$ is minimized, the number of iterations is minimized also.
Example (Generating a Normal Random Variable): Let $Z$ be the standard normal random variable, and let $X = |Z|$. The density function of $X$ is

$$f(x) = \frac{2}{\sqrt{2\pi}}e^{-x^2/2}, \quad 0 < x < \infty.$$

We first generate $X$ by using the rejection method with $g$ being the exponential density function with mean 1, that is,

$$g(x) = e^{-x}, \quad 0 < x < \infty.$$

Now

$$\frac{f(x)}{g(x)} = \sqrt{\frac{2}{\pi}}e^{x - x^2/2},$$

and so the maximum value of $f(x)/g(x)$ occurs at the value of $x$ that maximizes $x - x^2/2$. By simple calculus this occurs at $x = 1$. So we can take

$$c = \max_x\frac{f(x)}{g(x)} = \frac{f(1)}{g(1)} = \sqrt{\frac{2e}{\pi}}.$$

Hence

$$\frac{f(x)}{cg(x)} = \exp\left\{-\frac{(x - 1)^2}{2}\right\}.$$
Algorithm for generating the absolute value of a standard normal random variable:

Step 1: Generate $Y$, an exponential random variable with mean 1.
Step 2: Generate a random number $U$.
Step 3: If $U \le \exp\{-(Y - 1)^2/2\}$, set $X = Y$. Otherwise, return to Step 1.

Once we have simulated the random variable $X$, we can generate the standard normal random variable $Z$ by letting $Z$ be equally likely to be either $X$ or $-X$. This can be written algorithmically as follows:

Step 1: Generate $X$.
Step 2: Generate a random number $U$.
Step 3: If $U \le 1/2$, set $Z = X$. Otherwise set $Z = -X$.
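Both algorithms combined, sketched in Python (an illustration, not part of the original notes):

```python
import math
import random

def standard_normal(rng=random):
    """Standard normal Z: rejection with an exponential(1) proposal gives |Z|,
    then a random sign is attached."""
    while True:
        y = -math.log(rng.random())  # exponential with mean 1
        if rng.random() <= math.exp(-(y - 1.0) ** 2 / 2.0):
            break
    return y if rng.random() <= 0.5 else -y
```

Over many draws, the sample mean should be near 0 and the sample variance near 1.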
26.1 Generating Poisson Processes

Suppose we want to generate the first $n$ event times of a Poisson process with rate $\lambda$. Since the times between successive events are independent exponential random variables each with mean $1/\lambda$, one way to generate the process is as follows:

Step 1: $i = 1$ and $t = 0$.
Step 2: Generate $U$ and set $t = t - \frac{1}{\lambda}\log U$.
Step 3: Set $S(i) = t$ and $i = i + 1$.
Step 4: If $i > n$, stop. Otherwise, return to Step 2.

The values $S(i)$, $i = 1, 2, \ldots, n$, will be the first $n$ event times. If we want to generate the first $T$ time units of the Poisson process, the algorithm can be written as follows:

Step 1: $t = 0$, $I = 0$.
Step 2: Generate a random number $U$.
Step 3: $t = t - \frac{1}{\lambda}\log U$. If $t > T$, stop.
Step 4: $I = I + 1$, $S(I) = t$.
Step 5: Go to Step 2.

The final value of $I$ will represent the number of events that occur by time $T$.
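The second (fixed-horizon) algorithm can be sketched as follows (illustrative Python):

```python
import math
import random

def poisson_process_times(lam, T, rng=random):
    """Event times of a Poisson process with rate lam over (0, T]."""
    times = []
    t = 0.0
    while True:
        t -= math.log(rng.random()) / lam  # add an exponential inter-event time
        if t > T:
            return times
        times.append(t)
```

The length of the returned list plays the role of $I$, the number of events by time $T$; its expected value is $\lambda T$.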
27 Simulations of Markovian Queues

Consider a one-server queueing system with unlimited waiting space, in which the arrival of customers is a Poisson process with mean inter-arrival time $1/\lambda = 10$ minutes and the exponential server has mean service time $1/\mu = 6$ minutes. We are going to simulate this queueing system.

The key step in the simulation is to generate two sequences of exponentially distributed random variables: $a_i$ (the inter-arrival times of customers), with mean 10, and $s_i$ (the service times), with mean 6. Recall that

$$-\frac{1}{\lambda}\log(1 - U)$$

is an exponentially distributed random variable with parameter $\lambda$ when $U$ is uniformly distributed on (0, 1).

We set the length of the simulation to be 90 minutes. The following inter-arrival times $a_i$ are generated: 8, 5, 8, 12, 7, 8, 9, 8, 9, 5; the last customer arrives at the 79th minute. The service times $s_i$ of the ten customers are generated as: 7, 8, 4, 6, 7, 6, 5, 4, 8, 5.
One may construct the following table. The arrival time, time into service and departure time are, respectively, the time when the customer arrives at the system, the time when the customer starts his service, and the time when the customer leaves the system. We note that the time into service of a customer will be the maximum of his arrival time and the departure time of the previous customer. The queueing time is the time that the customer spends queueing, and the time a customer spends in the system is the sum of his queueing time and service time.

In our case, the average queueing time of each customer is $(2 + 2 + 3)/10 = 0.7$ minutes, and the average time each customer spends in the system is $67/10 = 6.7$ minutes. Finally, by applying Little's formula, the expected number of customers in the system is

$$\hat{\lambda}W = \frac{10}{90} \times 6.7 \approx 0.74,$$

where the arrival rate is estimated as 10 customers over the 90-minute run. The divisor is 90, the set length of the simulation, even though the last customer departs at minute 87. The theoretical value of the expected number of customers is given by

$$\frac{\rho}{1 - \rho} = \frac{0.6}{0.4} = 1.5, \quad \text{where } \rho = \frac{\lambda}{\mu} = \frac{1/10}{1/6} = 0.6.$$

The large deviation can be explained by the random error caused by the short simulation time.
Customer  Arrival  Time into  Service  Departure  Queueing  Time in
Number    Time     Service    Time     Time       Time      System
   1         8         8         7        15         0         7
   2        13        15         8        23         2        10
   3        21        23         4        27         2         6
   4        33        33         6        39         0         6
   5        40        40         7        47         0         7
   6        48        48         6        54         0         6
   7        57        57         5        62         0         5
   8        65        65         4        69         0         4
   9        74        74         8        82         0         8
  10        79        82         5        87         3         8

Table: Simulation of the single-server Markovian queue.
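The bookkeeping of the table, where the time into service is the maximum of the arrival time and the previous departure, can be sketched as follows (illustrative Python; the tenth service time is taken as 5, consistent with the stated averages):

```python
def simulate_single_server(interarrivals, services):
    """FIFO single-server queue: return (queueing times, times in system)."""
    arrival = 0.0
    prev_departure = 0.0
    queueing, in_system = [], []
    for a, s in zip(interarrivals, services):
        arrival += a
        start = max(arrival, prev_departure)  # wait for the previous departure
        prev_departure = start + s
        queueing.append(start - arrival)
        in_system.append(prev_departure - arrival)
    return queueing, in_system
```

Applied to the generated sequences above, the average queueing time comes out to 0.7 minutes and the average time in system to 6.7 minutes, matching the text.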
A Summary on Simulation

- Generation of random numbers / uniformly distributed random variables by the congruential method.
- Approximation of definite integrals by random numbers.
- The inverse transform method for discrete and continuous random variables.
- The acceptance-rejection method for discrete and continuous random variables.
- Generation of Poisson and exponential random variables.
In-class problems 1. Customers arrive at a single-service facility at a Poisson rate of 40 per hour. When two or fewer customers are present, a single attendant operates the facility, and the service time
More informationEE 368. Weeks 3 (Notes)
EE 368 Weeks 3 (Notes) 1 State of a Queuing System State: Set of parameters that describe the condition of the system at a point in time. Why do we need it? Average size of Queue Average waiting time How
More informationGeneral Principles in Random Variates Generation
General Principles in Random Variates Generation E. Moulines and G. Fort Telecom ParisTech June 2015 Bibliography : Luc Devroye, Non-Uniform Random Variate Generator, Springer-Verlag (1986) available on
More informationNUMERICAL ANALYSIS PROBLEMS
NUMERICAL ANALYSIS PROBLEMS JAMES KEESLING The problems that follow illustrate the methods covered in class. They are typical of the types of problems that will be on the tests.. Solving Equations Problem.
More informationChapter 2 SIMULATION BASICS. 1. Chapter Overview. 2. Introduction
Chapter 2 SIMULATION BASICS 1. Chapter Overview This chapter has been written to introduce the topic of discreteevent simulation. To comprehend the material presented in this chapter, some background in
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationNICTA Short Course. Network Analysis. Vijay Sivaraman. Day 1 Queueing Systems and Markov Chains. Network Analysis, 2008s2 1-1
NICTA Short Course Network Analysis Vijay Sivaraman Day 1 Queueing Systems and Markov Chains Network Analysis, 2008s2 1-1 Outline Why a short course on mathematical analysis? Limited current course offering
More informationMAS1302 Computational Probability and Statistics
MAS1302 Computational Probability and Statistics April 23, 2008 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great
More informationIE 581 Introduction to Stochastic Simulation
1. List criteria for choosing the majorizing density r (x) when creating an acceptance/rejection random-variate generator for a specified density function f (x). 2. Suppose the rate function of a nonhomogeneous
More informationMa 530 Power Series II
Ma 530 Power Series II Please note that there is material on power series at Visual Calculus. Some of this material was used as part of the presentation of the topics that follow. Operations on Power Series
More informationOBJECTIVES Use the area under a graph to find total cost. Use rectangles to approximate the area under a graph.
4.1 The Area under a Graph OBJECTIVES Use the area under a graph to find total cost. Use rectangles to approximate the area under a graph. 4.1 The Area Under a Graph Riemann Sums (continued): In the following
More informationChapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan
Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition
More informationMoments. Raw moment: February 25, 2014 Normalized / Standardized moment:
Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230
More informationHW7 Solutions. f(x) = 0 otherwise. 0 otherwise. The density function looks like this: = 20 if x [10, 90) if x [90, 100]
HW7 Solutions. 5 pts.) James Bond James Bond, my favorite hero, has again jumped off a plane. The plane is traveling from from base A to base B, distance km apart. Now suppose the plane takes off from
More informationChapter 2: Random Variables
ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:
More informationPart I Stochastic variables and Markov chains
Part I Stochastic variables and Markov chains Random variables describe the behaviour of a phenomenon independent of any specific sample space Distribution function (cdf, cumulative distribution function)
More informationh(x) = f(x)/g(x) = (1 + x)e (1/2)x.
IEOR 4404 Homework 3 1. TEXT, Page 94, Exercise 20. (HINT: Consider g(x) = λe λx, for an appropriate value of λ.) We want to simulate from the density f(x) = 1 2 (1 + x)e x, x > 0. We can use an exponential
More informationBMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution
Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random
More informationSlides 9: Queuing Models
Slides 9: Queuing Models Purpose Simulation is often used in the analysis of queuing models. A simple but typical queuing model is: Queuing models provide the analyst with a powerful tool for designing
More informationComputer Applications for Engineers ET 601
Computer Applications for Engineers ET 601 Asst. Prof. Dr. Prapun Suksompong prapun@siit.tu.ac.th Random Variables (Con t) 1 Office Hours: (BKD 3601-7) Wednesday 9:30-11:30 Wednesday 16:00-17:00 Thursday
More informationModelling the risk process
Modelling the risk process Krzysztof Burnecki Hugo Steinhaus Center Wroc law University of Technology www.im.pwr.wroc.pl/ hugo Modelling the risk process 1 Risk process If (Ω, F, P) is a probability space
More informationCourse Outline Introduction to Transportation Highway Users and their Performance Geometric Design Pavement Design
Course Outline Introduction to Transportation Highway Users and their Performance Geometric Design Pavement Design Speed Studies - Project Traffic Queuing Intersections Level of Service in Highways and
More informationCDA5530: Performance Models of Computers and Networks. Chapter 4: Elementary Queuing Theory
CDA5530: Performance Models of Computers and Networks Chapter 4: Elementary Queuing Theory Definition Queuing system: a buffer (waiting room), service facility (one or more servers) a scheduling policy
More informationChp 4. Expectation and Variance
Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.
More informationRenewal theory and its applications
Renewal theory and its applications Stella Kapodistria and Jacques Resing September 11th, 212 ISP Definition of a Renewal process Renewal theory and its applications If we substitute the Exponentially
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationQueueing Theory I Summary! Little s Law! Queueing System Notation! Stationary Analysis of Elementary Queueing Systems " M/M/1 " M/M/m " M/M/1/K "
Queueing Theory I Summary Little s Law Queueing System Notation Stationary Analysis of Elementary Queueing Systems " M/M/1 " M/M/m " M/M/1/K " Little s Law a(t): the process that counts the number of arrivals
More informationChapter 8 Queuing Theory Roanna Gee. W = average number of time a customer spends in the system.
8. Preliminaries L, L Q, W, W Q L = average number of customers in the system. L Q = average number of customers waiting in queue. W = average number of time a customer spends in the system. W Q = average
More informationd(x n, x) d(x n, x nk ) + d(x nk, x) where we chose any fixed k > N
Problem 1. Let f : A R R have the property that for every x A, there exists ɛ > 0 such that f(t) > ɛ if t (x ɛ, x + ɛ) A. If the set A is compact, prove there exists c > 0 such that f(x) > c for all x
More informationStochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet
Stochastic Processes and Monte-Carlo Methods University of Massachusetts: Spring 2018 version Luc Rey-Bellet April 5, 2018 Contents 1 Simulation and the Monte-Carlo method 3 1.1 Review of probability............................
More informationSTAT 516 Midterm Exam 2 Friday, March 7, 2008
STAT 516 Midterm Exam 2 Friday, March 7, 2008 Name Purdue student ID (10 digits) 1. The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional
More information1 st Midterm Exam Solution. Question #1: Student s Number. Student s Name. Answer the following with True or False:
1 st Midterm Exam Solution اس تعن ابهلل وكن عىل يقني بأ ن لك ما ورد يف هذه الورقة تعرفه جيدا وقد تدربت عليه مبا فيه الكفاية Student s Name Student s Number Question #1: Answer the following with True or
More informationGeneration from simple discrete distributions
S-38.3148 Simulation of data networks / Generation of random variables 1(18) Generation from simple discrete distributions Note! This is just a more clear and readable version of the same slide that was
More informationLecture 20. Randomness and Monte Carlo. J. Chaudhry. Department of Mathematics and Statistics University of New Mexico
Lecture 20 Randomness and Monte Carlo J. Chaudhry Department of Mathematics and Statistics University of New Mexico J. Chaudhry (UNM) CS 357 1 / 40 What we ll do: Random number generators Monte-Carlo integration
More informationPBW 654 Applied Statistics - I Urban Operations Research
PBW 654 Applied Statistics - I Urban Operations Research Lecture 2.I Queuing Systems An Introduction Operations Research Models Deterministic Models Linear Programming Integer Programming Network Optimization
More informationContinuous Time Processes
page 102 Chapter 7 Continuous Time Processes 7.1 Introduction In a continuous time stochastic process (with discrete state space), a change of state can occur at any time instant. The associated point
More informationCOMP9334 Capacity Planning for Computer Systems and Networks
COMP9334 Capacity Planning for Computer Systems and Networks Week 2: Operational Analysis and Workload Characterisation COMP9334 1 Last lecture Modelling of computer systems using Queueing Networks Open
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationNon Markovian Queues (contd.)
MODULE 7: RENEWAL PROCESSES 29 Lecture 5 Non Markovian Queues (contd) For the case where the service time is constant, V ar(b) = 0, then the P-K formula for M/D/ queue reduces to L s = ρ + ρ 2 2( ρ) where
More informationComputer Systems Modelling
Computer Systems Modelling Computer Laboratory Computer Science Tripos, Part II Michaelmas Term 2003 R. J. Gibbens Problem sheet William Gates Building JJ Thomson Avenue Cambridge CB3 0FD http://www.cl.cam.ac.uk/
More informationLecture Notes 2 Random Variables. Random Variable
Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran November 13, 2014.
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2014 Kannan Ramchandran November 13, 2014 Midterm Exam 2 Last name First name SID Rules. DO NOT open the exam until instructed
More informationChing-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12
Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution
More informationIntroduction to Markov Chains, Queuing Theory, and Network Performance
Introduction to Markov Chains, Queuing Theory, and Network Performance Marceau Coupechoux Telecom ParisTech, departement Informatique et Réseaux marceau.coupechoux@telecom-paristech.fr IT.2403 Modélisation
More informationStochastic Models of Manufacturing Systems
Stochastic Models of Manufacturing Systems Ivo Adan Systems 2/49 Continuous systems State changes continuously in time (e.g., in chemical applications) Discrete systems State is observed at fixed regular
More informationMonte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan
Monte-Carlo MMD-MA, Université Paris-Dauphine Xiaolu Tan tan@ceremade.dauphine.fr Septembre 2015 Contents 1 Introduction 1 1.1 The principle.................................. 1 1.2 The error analysis
More informationBulk input queue M [X] /M/1 Bulk service queue M/M [Y] /1 Erlangian queue M/E k /1
Advanced Markovian queues Bulk input queue M [X] /M/ Bulk service queue M/M [Y] / Erlangian queue M/E k / Bulk input queue M [X] /M/ Batch arrival, Poisson process, arrival rate λ number of customers in
More informationSimulation. Where real stuff starts
Simulation Where real stuff starts March 2019 1 ToC 1. What is a simulation? 2. Accuracy of output 3. Random Number Generators 4. How to sample 5. Monte Carlo 6. Bootstrap 2 1. What is a simulation? 3
More informationSTAT/MA 416 Midterm Exam 2 Thursday, October 18, Circle the section you are enrolled in:
STAT/MA 46 Midterm Exam 2 Thursday, October 8, 27 Name Purdue student ID ( digits) Circle the section you are enrolled in: STAT/MA 46-- STAT/MA 46-2- 9: AM :5 AM 3: PM 4:5 PM REC 4 UNIV 23. The testing
More informationChapter 5: Special Types of Queuing Models
Chapter 5: Special Types of Queuing Models Some General Queueing Models Discouraged Arrivals Impatient Arrivals Bulk Service and Bulk Arrivals OR37-Dr.Khalid Al-Nowibet 1 5.1 General Queueing Models 1.
More information