Variance reduction techniques

Variance reduction techniques
Lecturer: Dmitri A. Moltchanov
E-mail: moltchan@cs.tut.fi
http://www.cs.tut.fi/~moltchan/modsim/

OUTLINE:
Simulation with a given confidence;
Variance reduction techniques;
Antithetic variates technique;
Control variates technique;
Method of conditioning;
Validation of simulations.

1. Simulation with a given confidence
Two inverse tasks:
what we considered: what are the confidence intervals given N observations; is it OK to say that the mean is 50 ± 30?
what we are asked: provide a kind of assurance; we would perhaps like to say 50 ± 3...
question: how many experiments are needed to achieve that?
General notes:
recall that the width of the confidence interval is proportional to 1/\sqrt{N}, where N is the number of iid observations;
the larger N, the smaller the interval;
to halve the confidence interval, increase N four times:
\left( \hat{E}[X] - z_{\alpha/2} \frac{\hat{s}}{\sqrt{N}},\; \hat{E}[X] + z_{\alpha/2} \frac{\hat{s}}{\sqrt{N}} \right).   (1)
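As a rough illustration of (1), the sketch below (assuming a 95% confidence level, z_{\alpha/2} ≈ 1.96, and a hypothetical pilot sample) computes the half-width of the interval and the approximate number of iid observations needed to reach a target half-width such as ±3:

```python
import numpy as np

def half_width(x, z=1.96):
    """95% confidence-interval half-width z * s / sqrt(N) for an iid sample x."""
    x = np.asarray(x, dtype=float)
    return z * x.std(ddof=1) / np.sqrt(len(x))

def required_n(x, target, z=1.96):
    """Rough number of iid observations needed for a given half-width,
    using the pilot-sample variance estimate: N >= (z * s / target)**2."""
    s = np.std(x, ddof=1)
    return int(np.ceil((z * s / target) ** 2))

# Hypothetical pilot sample with mean around 50; halving the target
# half-width roughly quadruples the required N, as stated above.
rng = np.random.default_rng(1)
pilot = rng.normal(50, 15, size=100)
print(half_width(pilot), required_n(pilot, target=3.0))
```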

Planning prior to simulations:
we do not know in advance how many observations are needed:
what accuracy may N experiments give?
e.g. how many observations are needed to get 50 ± 3?
solution 1: carry out a pilot experiment:
aim 1: to get an overall idea of how things go;
aim 2: to obtain a rough estimate of the N providing the required accuracy.
solution 2: sequential in-simulation checking:
a test is carried out periodically to check whether the required accuracy has been achieved.
We consider these approaches for:
the batch means method;
the method of replications.

1.1. Method of replications
Use of pilot experiments: we want to estimate a statistic a with intervals ±0.1â:
a pilot experiment is carried out to collect N_1 replications;
let â_1 be the point estimate of a;
let Δ_1 be the width of the confidence interval;
check the following:
if Δ_1 ≤ 0.1â_1, we stop;
if Δ_1 > 0.1â_1, we carry out the main simulation with N_2 = (Δ_1 / 0.1â_1)^2 N_1 replications;
we obtain new â_2 and Δ_2: they may not be exactly what we wanted (0.1â_2) due to randomness;
if Δ_2 > 0.1â_2, carry out a new attempt.
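A minimal sketch of this pilot-then-main procedure, assuming a 95% normal-approximation interval and a hypothetical user-supplied run_replication() function that returns one replication's observation of the statistic of interest:

```python
import numpy as np

Z = 1.96  # 95% confidence level

def ci_width(obs):
    """Confidence-interval half-width z * s / sqrt(N) for iid observations."""
    obs = np.asarray(obs, dtype=float)
    return Z * obs.std(ddof=1) / np.sqrt(len(obs))

def pilot_then_main(run_replication, n_pilot=10, rel=0.1):
    """Pilot version of the method of replications: if the pilot interval is
    too wide, rerun with N_2 = (delta_1 / (rel * a_1))**2 * N_1 replications."""
    obs = [run_replication() for _ in range(n_pilot)]
    a_hat, delta = np.mean(obs), ci_width(obs)
    if delta <= rel * abs(a_hat):
        return a_hat, delta
    n_main = int(np.ceil((delta / (rel * abs(a_hat))) ** 2 * n_pilot))
    obs = [run_replication() for _ in range(n_main)]
    return np.mean(obs), ci_width(obs)

# Hypothetical usage: each replication returns, e.g., an average waiting time.
rng = np.random.default_rng(0)
print(pilot_then_main(lambda: rng.normal(50, 15)))
```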

Sequential in-simulation checking: we want to estimate a statistic a with intervals ±0.1â:
we carry out N replications;
we calculate â_1 and Δ_1 from these replications;
check the following:
if Δ_1 ≤ 0.1â_1, we stop;
if Δ_1 > 0.1â_1, we carry out N additional replications;
we calculate new â_2 and Δ_2 based on the total of 2N replications:
if Δ_2 ≤ 0.1â_2, we stop;
if Δ_2 > 0.1â_2, we carry out N additional replications;
repeat...
Note: N may be set to 10.
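The same check written as a loop (again only a sketch, with the same 95% interval and hypothetical run_replication() as above):

```python
import numpy as np

Z = 1.96

def sequential_replications(run_replication, n_step=10, rel=0.1, max_reps=10_000):
    """Sequential in-simulation checking: add n_step replications at a time
    until the CI half-width is within rel * |estimate| (or a cap is reached)."""
    obs = []
    while len(obs) < max_reps:
        obs.extend(run_replication() for _ in range(n_step))
        x = np.asarray(obs, dtype=float)
        a_hat = x.mean()
        delta = Z * x.std(ddof=1) / np.sqrt(len(x))
        if delta <= rel * abs(a_hat):
            break
    return a_hat, delta, len(obs)
```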

1.2. Method of batch means
Use of pilot experiments: we want to estimate a statistic a with intervals ±0.1â:
gather k_1 batches;
estimate â_1 and Δ_1;
check the following:
if Δ_1 ≤ 0.1â_1, we stop;
if Δ_1 > 0.1â_1, additional batches have to be simulated: aim for k_2 = (Δ_1 / 0.1â_1)^2 k_1 batches;
calculate new â_2 and Δ_2 based on all the batches;
they may not be exactly what we wanted (0.1â_2) due to randomness.
we may either:
simulate k_2 - k_1 additional batches and reuse the pilot batches (preferred);
or simulate k_2 fresh batches (k_2 + k_1 batches simulated overall).

Sequential in-simulation checking: we want to estimate a statistic a with intervals ±0.1â:
gather k batches;
estimate â_1 and Δ_1;
check the following:
if Δ_1 ≤ 0.1â_1, we stop;
if Δ_1 > 0.1â_1, gather k additional batches;
calculate new â_2 and Δ_2 based on the total of 2k batches;
check the following:
if Δ_2 ≤ 0.1â_2, we stop;
if Δ_2 > 0.1â_2, gather k additional batches;
repeat...
Note: k may be set as for the method of replications (e.g., to 10).
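A minimal sketch of the batch means computation itself, assuming one long run of (approximately stationary) output y is split into k equal batches whose means are treated as roughly iid:

```python
import numpy as np

Z = 1.96

def batch_means_ci(y, k):
    """Split one long output sequence into k equal batches and build a
    confidence interval from the batch means (treated as approximately iid)."""
    y = np.asarray(y, dtype=float)
    m = len(y) // k                          # observations per batch
    means = y[:k * m].reshape(k, m).mean(axis=1)
    a_hat = means.mean()
    delta = Z * means.std(ddof=1) / np.sqrt(k)
    return a_hat, delta

# Hypothetical usage: y could be a sequence of waiting times from one long run.
y = np.random.default_rng(0).exponential(2.0, size=50_000)
print(batch_means_ci(y, k=20))
```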

2. Variance reduction techniques
Suppose we have to estimate a mean given N iid observations:
the point estimate of the mean is:
\hat{E}[X] = \frac{1}{N} \sum_{i=1}^{N} X_i,   (2)
the confidence interval for the mean is given by:
\left( \hat{E}[X] - z_{\alpha/2} \frac{\hat{s}}{\sqrt{N}},\; \hat{E}[X] + z_{\alpha/2} \frac{\hat{s}}{\sqrt{N}} \right),   (3)
where \hat{s}^2 is the estimate of the variance.
How to shorten the confidence interval:
the accuracy of an estimate can be increased by increasing the number of observations;
shortcoming: this may require very long simulations;
the accuracy can also be improved by reducing the variance.

General notes:
techniques that try to reduce the variance are called variance reduction techniques;
they require additional computational effort;
it is not known in advance (prior to simulation) whether they actually reduce the variance.
To decide whether a technique helps to reduce the variance:
pilot experiments;
in-simulation checks.
We consider the following variance reduction techniques:
antithetic variates technique;
control variates technique;
method of conditioning.

3. Antithetic variates technique
General notes:
a very simple technique;
requires only a few additions to the program.
Major shortcomings:
no guarantee of effectiveness;
no information in advance on how much the variance will be reduced.
Take the following assumptions:
iid observations (x_1^{(1)}, x_2^{(1)}, ..., x_n^{(1)}) are obtained in the first simulation;
iid observations (x_1^{(2)}, x_2^{(2)}, ..., x_n^{(2)}) are obtained in the second simulation.

The idea of the method:
define a new random variable Z = (X^{(1)} + X^{(2)})/2:
z_i = \frac{x_i^{(1)} + x_i^{(2)}}{2}, \quad i = 1, 2, ..., n.   (4)
for the mean of Z we have:
E[Z] = E\left[\frac{X^{(1)} + X^{(2)}}{2}\right] = \frac{1}{2}\left(E[X^{(1)}] + E[X^{(2)}]\right) = E[X].   (5)
for the variance of Z we have:
Var[Z] = Var\left[\frac{X^{(1)} + X^{(2)}}{2}\right] = \frac{1}{4}\left(Var[X^{(1)}] + Var[X^{(2)}] + 2\,Cov(X^{(1)}, X^{(2)})\right).   (6)
recalling that Var[X^{(1)}] = Var[X^{(2)}] = Var[X], we have:
Var[Z] = \frac{1}{2}\left(Var[X] + Cov(X^{(1)}, X^{(2)})\right).   (7)

since Cov(X, Y) = \rho \sqrt{Var[X]\,Var[Y]}, we have:
Var[Z] = \frac{1}{2} Var[X] (1 + \rho),   (8)
where \rho is the correlation coefficient between X^{(1)} and X^{(2)}.
next, use Z (instead of X) to determine confidence intervals as usual.
How the reduction is achieved: observe Var[Z] and see that:
Var[Z] \to Var[X] when \rho \to 1, \quad Var[Z] \to 0 when \rho \to -1.   (9)
also note the special case:
Var[Z] = \frac{1}{2} Var[X] when \rho = 0.   (10)
The idea: construct (x_1^{(1)}, x_2^{(1)}, ..., x_n^{(1)}) and (x_1^{(2)}, x_2^{(2)}, ..., x_n^{(2)}) such that \rho < 0.
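A minimal numerical illustration (a toy problem, not from the lecture): estimating E[e^U] with U ~ U(0,1) using antithetic pairs u and 1 - u, which produce negatively correlated observations through the monotone function e^u:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Crude Monte Carlo with 2n independent uniforms.
u = rng.random(2 * n)
crude = np.exp(u)

# Antithetic variates: n uniforms paired with 1 - u (same total work).
u1 = rng.random(n)
z = 0.5 * (np.exp(u1) + np.exp(1.0 - u1))   # Z = (X^(1) + X^(2)) / 2

def se(x):
    return x.std(ddof=1) / np.sqrt(len(x))

print("crude     :", crude.mean(), "+/-", 1.96 * se(crude))
print("antithetic:", z.mean(), "+/-", 1.96 * se(z))
# Both estimate E[e^U] = e - 1 ~ 1.718; the antithetic interval is much narrower.
```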

Negative correlation:
queueing system: X waiting times, Y interarrival times: when Y is small, X is large, and vice versa.
How to create negative correlation in this example:
let F(t) and G(s) be the CDFs of the interarrival and service times, respectively;
let r_i and v_i be pseudo-random numbers from U(0, 1);
let t_i = F^{-1}(r_i) and s_i = G^{-1}(v_i) be the interarrival and service times associated with the ith arrival;
to determine whether the queue increases or decreases, consider d_i = t_i - s_i: negative: busy period, positive: empty period.
consider the second run and associate numbers r'_i and v'_i with the ith arrival so that:
d'_i = t'_i - s'_i has the opposite sign compared to d_i;
this can be achieved using r'_i = 1 - r_i and v'_i = 1 - v_i.
we have negative correlation between the two runs!

How to implement:
make the first run and get (x_1^{(1)}, x_2^{(1)}, ..., x_n^{(1)});
make the second run using r'_i = 1 - r_i and v'_i = 1 - v_i to get (x_1^{(2)}, x_2^{(2)}, ..., x_n^{(2)});
construct point and interval estimates using Z = (X^{(1)} + X^{(2)})/2.
Straightforward simulation: sampling of every 10th customer.
Method of antithetic variates: ±1.76 for n = 300 (600 observations over the two runs).
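A sketch of such an implementation for a single-server FIFO queue, assuming exponential interarrival and service times (hypothetical rates lam and mu) and the Lindley recursion for waiting times; note that within one run the waiting times are autocorrelated, so in practice Z would still be combined with replications or batch means before building intervals:

```python
import numpy as np

def waiting_times(r, v, lam=0.9, mu=1.0):
    """Lindley recursion for FIFO single-server waiting times.
    r, v are U(0,1) streams; interarrival and service times come from the
    inverse transform: t = -ln(1-r)/lam, s = -ln(1-v)/mu."""
    t = -np.log1p(-r) / lam
    s = -np.log1p(-v) / mu
    w = np.empty(len(r))
    w[0] = 0.0
    for i in range(1, len(r)):
        w[i] = max(0.0, w[i - 1] + s[i - 1] - t[i])
    return w

rng = np.random.default_rng(42)
n = 5_000
r, v = rng.random(n), rng.random(n)

x1 = waiting_times(r, v)               # first run
x2 = waiting_times(1.0 - r, 1.0 - v)   # antithetic run: r' = 1 - r, v' = 1 - v
z = 0.5 * (x1 + x2)                    # antithetic estimator of the mean waiting time

print("corr(x1, x2) =", np.corrcoef(x1, x2)[0, 1])   # typically negative
print("mean waiting time estimate:", z.mean())
```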


Important note: this technique does not always provide better results;
example: for an M/M/2 queue the results are only slightly better.

4. Control variates technique
Also known as the method of control variables.
Assume the following:
X: an endogenously created variable whose mean we have to estimate;
Y: an endogenously created variable whose mean is known in advance;
Y is strongly correlated with X.
When Y and X are negatively correlated: define Z = X + Y - E[Y]; we have:
E[Z] = E[X + Y - E[Y]] = E[X],
Var[Z] = Var[X] + Var[Y] + 2\,Cov(X, Y).   (11)
since Y and X are negatively correlated, we have Cov(X, Y) < 0;
if Var[Y] + 2\,Cov(X, Y) < 0, we reduce the variance.

When Y and X are positively correlated: define Z = X - Y + E[Y]; we have:
E[Z] = E[X - Y + E[Y]] = E[X],
Var[Z] = Var[X] + Var[Y] - 2\,Cov(X, Y).   (12)
since Y and X are positively correlated, we have Cov(X, Y) > 0;
if Var[Y] - 2\,Cov(X, Y) < 0, we reduce the variance.
Example: queueing system: X waiting times, Y interarrival times: when Y is small, X is large, and vice versa: negative correlation:
get observations (x_1, x_2, ..., x_n) and (y_1, y_2, ..., y_n) and let:
z_i = x_i + y_i - E[Y], \quad i = 1, 2, ..., n.   (13)
construct confidence intervals for E[X] using \hat{E}[Z] \pm 1.96\,\hat{s}/\sqrt{n}, where
\hat{E}[Z] = \frac{1}{n} \sum_{i=1}^{n} z_i, \quad \hat{s}^2[Z] = \frac{1}{n-1} \sum_{i=1}^{n} \left(z_i - \hat{E}[Z]\right)^2.   (14)
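A minimal numerical sketch of (12)-(14), again using the toy problem of estimating E[e^U] rather than the lecture's queueing example: the control Y = U has known mean E[Y] = 0.5 and is positively correlated with X = e^U, so Z = X - Y + E[Y] is used:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
u = rng.random(n)

x = np.exp(u)        # X: quantity whose mean E[e^U] = e - 1 we estimate
y = u                # Y: control variate with known mean E[Y] = 0.5
z = x - (y - 0.5)    # positively correlated case, eq. (12): Z = X - Y + E[Y]

def report(sample, label):
    m = sample.mean()
    hw = 1.96 * sample.std(ddof=1) / np.sqrt(len(sample))
    print(f"{label}: {m:.4f} +/- {hw:.4f}")

report(x, "plain estimator ")
report(z, "control variates")
```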

General view of the approach: the RV Z can be obtained as follows:
Z = X - a(Y - E[Y]),   (15)
where a is some constant to be estimated, and X and Y are positively or negatively correlated RVs.
we have E[Z] = E[X], and for Var[Z]:
Var[Z] = Var[X] + a^2\,Var[Y] - 2a\,Cov(X, Y).   (16)
Z has a smaller variance than X if:
a^2\,Var[Y] - 2a\,Cov(X, Y) < 0.   (17)
we have to select a to minimize the RHS of (16); setting the derivative to zero:
2a\,Var[Y] - 2\,Cov(X, Y) = 0, \quad a = \frac{Cov(X, Y)}{Var[Y]}.   (18)

substituting into the expression for Var[Z] we have:
Var[Z] = Var[X] - \frac{[Cov(X, Y)]^2}{Var[Y]} = (1 - \rho_{XY}^2)\,Var[X].   (19)
note: if X and Y are correlated, we always get a reduction of variance if a is optimal:
determining a requires knowledge of Var[Y] and Cov(X, Y);
pilot experiments can be used to get estimates of Var[Y] and Cov(X, Y).
we can also generalize further to get:
Z = X - \sum_{i=1}^{m} a_i (Y_i - E[Y_i]),   (20)
where a_i, i = 1, 2, ..., m are any real numbers.
Important notes:
this method may give assurance after a pilot experiment;
the complexity is a little higher than that of the antithetic variates technique.
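A sketch of (18)-(19) in which the coefficient a is estimated from the data; in practice a could come from a pilot run, and estimating it from the same sample introduces a small bias that is usually negligible:

```python
import numpy as np

def control_variate_estimate(x, y, ey):
    """Control variates with the (approximately) optimal coefficient
    a = Cov(X, Y) / Var[Y], estimated from the sample (eq. 18)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    a = np.cov(x, y, ddof=1)[0, 1] / np.var(y, ddof=1)
    z = x - a * (y - ey)
    return z.mean(), z.std(ddof=1) / np.sqrt(len(z)), a

# Same toy setting as before: X = e^U with control Y = U, E[Y] = 0.5.
u = np.random.default_rng(1).random(10_000)
print(control_variate_estimate(np.exp(u), u, 0.5))
```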

5. Method of conditioning
The basis of the method:
X and Y are arbitrary random variables;
for X and Y we have:
E[X] = E[E[X|Y]],
Var[X] = E[Var[X|Y]] + Var[E[X|Y]] \ge Var[E[X|Y]].   (21)
note that the RV E[X|Y] has the same mean as, and a smaller (or equal) variance than, the RV X.
The idea of the method:
let E[X] be the performance measure we have to estimate;
if RV Y is such that E[X|Y = y] is known:
we simulate Y;
we use E[X|Y] as the estimator.

Example: M/M/1/N queueing system:
performance measure E[X]: the mean number of lost customers by time t;
direct simulation: K runs until time t;
if X_i is the number of lost customers in run i, we estimate the mean as:
\hat{E}[X] = \frac{1}{K} \sum_{i=1}^{K} X_i.   (22)
we reduce the variance of the point estimator as follows:
let Y_i be the total amount of time in (0, t] during which there are N customers in the system in the ith run;
since we have a Poisson arrival process, we have:
E[X\,|\,Y_i] = \lambda Y_i.   (23)
the estimator with reduced variance is then \lambda \hat{E}[Y], where:
\hat{E}[Y] = \frac{1}{K} \sum_{i=1}^{K} Y_i.   (24)
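A compact sketch of this example, assuming N denotes the total system capacity (an arrival that finds N customers is lost) and using the parameters of the numerical example on the next slide:

```python
import numpy as np

def mm1n_run(lam, mu, N, t_end, rng):
    """One run of an M/M/1/N queue up to time t_end: returns the number of
    lost customers and the total time during which the system held N customers."""
    t, n, lost, time_full = 0.0, 0, 0, 0.0
    t_arr = rng.exponential(1 / lam)   # next arrival epoch
    t_dep = np.inf                     # next departure epoch
    while True:
        t_next = min(t_arr, t_dep, t_end)
        if n == N:
            time_full += t_next - t    # accumulate time spent at full capacity
        t = t_next
        if t >= t_end:
            return lost, time_full
        if t == t_arr:                 # arrival event
            if n == N:
                lost += 1
            else:
                n += 1
                if n == 1:
                    t_dep = t + rng.exponential(1 / mu)
            t_arr = t + rng.exponential(1 / lam)
        else:                          # departure event
            n -= 1
            t_dep = t + rng.exponential(1 / mu) if n > 0 else np.inf

rng = np.random.default_rng(0)
lam, mu, N, t_end, K = 0.5, 1.0, 3, 1000.0, 50
runs = [mm1n_run(lam, mu, N, t_end, rng) for _ in range(K)]
losses = np.array([r[0] for r in runs], dtype=float)
full_time = np.array([r[1] for r in runs], dtype=float)

print("direct estimator (22)      :", losses.mean(), "sd", losses.std(ddof=1))
print("conditioning estimator (24):", lam * full_time.mean(), "sd", lam * full_time.std(ddof=1))
```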

Example: numerical results: \lambda = 0.5, \mu = 1, N = 3, t = 1000;
the standard deviation is reduced by a factor of 1.3.
Figure 1: Estimation of E[X] without conditioning.
Figure 2: Estimation of E[X] using conditioning.

6. Validation of the simulation model
General notes:
how close are we to reality?
how confident are we that the simulation results are accurate?
a very important issue, and one that is very often neglected.
Example: developing a new switch:
we have to know its performance in advance;
the only way: model the system either analytically or via simulation studies;
there is almost no way to perform validation.
Example: extending an existing switch:
we have to know its performance in advance;
we can use the following approach:
simulate the real (existing) system and compare the results;
then extend the simulation model and simulate.

The following checks can be used:
check the pseudo-random number generators: independence, uniformity, etc.;
check the generators of arbitrarily distributed random numbers: independence or dependence tests, fitting criteria (e.g. \chi^2), etc.;
check the logic of the simulation program: print out small samples of variables, the event list and the data structures for hand checking;
check the relationship validity: print out all variables for small parts of a run and check whether they are updated correctly;
check the output validity: possible only if empirical data are available.
Some important notes:
these checks must be performed, especially if you get something you did not expect;
but: do not try to get what you expect!!!
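As one concrete illustration of the first check, a minimal sketch (numpy only; bin count and sample size are arbitrary choices) of a \chi^2 goodness-of-fit test for uniformity of a pseudo-random number stream:

```python
import numpy as np

def chi_square_uniformity(u, bins=10):
    """Chi-square goodness-of-fit statistic for U(0,1) samples u: compare
    observed bin counts with the expected count len(u)/bins. The statistic is
    approximately chi-square distributed with (bins - 1) degrees of freedom."""
    u = np.asarray(u, dtype=float)
    observed, _ = np.histogram(u, bins=bins, range=(0.0, 1.0))
    expected = len(u) / bins
    return ((observed - expected) ** 2 / expected).sum()

u = np.random.default_rng(7).random(100_000)
stat = chi_square_uniformity(u, bins=10)
print("chi-square statistic:", stat)   # compare with the chi-square(9) 5% critical value, ~16.92
```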
