Variance reduction techniques
Lecturer: Dmitri A. Moltchanov
OUTLINE:
- Simulation with a given accuracy;
- Variance reduction techniques;
- Antithetic variates technique;
- Control variates technique;
- Validation of simulations.
1. Simulation with a given accuracy

Two inverse tasks:
- what we considered so far: given N observations, what are the confidence intervals? E.g., is it acceptable if we got a mean estimate of, say, 50 ± 30 with confidence 95%?
- what we are always asked: provide a kind of assurance/guarantee, e.g. report the value within the interval 50 ± 3 with confidence 95%. Question: how to get this?

General notes:
- recall that the width of the confidence interval is proportional to 1/√N, where N is the number of iid observations; the larger N, the smaller the interval;
- to halve the confidence interval, increase N four times:

  (Ê[X] − z_{α/2}·ŝ/√N, Ê[X] + z_{α/2}·ŝ/√N). (1)
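As a quick numerical illustration of interval (1), here is a minimal Python sketch (the exponential sample data and the 95% z-value 1.96 are illustrative assumptions, not from the lecture) comparing the interval half-width for N and 4N observations:

```python
import math
import random

def mean_ci(samples, z=1.96):
    """Return the point estimate of the mean and the CI half-width z * s / sqrt(N)."""
    n = len(samples)
    mean = sum(samples) / n
    s2 = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance s^2
    return mean, z * math.sqrt(s2 / n)

random.seed(1)
data = [random.expovariate(1.0) for _ in range(4000)]

_, half_n = mean_ci(data[:1000])   # N = 1000 observations
_, half_4n = mean_ci(data)         # 4N = 4000 observations
ratio = half_n / half_4n           # should be close to 2
```

Quadrupling the number of observations roughly halves the half-width, as the 1/√N law predicts.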
Planning prior to simulations:
- we do not know in advance how many observations are needed:
  - what accuracy may N experiments give?
  - e.g., how many observations are needed to get 50 ± 3?
- solution 1: carry out a pilot experiment:
  - aim 1: get an overall idea of how things go;
  - aim 2: obtain a rough estimate of the N providing the required accuracy.
- solution 2: sequential in-simulation checking:
  - a test is carried out periodically to check whether the required accuracy has been achieved.

We consider these methods for:
- the batch means method;
- the method of replications.
1.1. Method of replications

Use of pilot experiments. We want to estimate a statistic a within the interval ±0.1â:
- a pilot experiment is carried out to collect N_1 replications;
- let â_1 be the point estimate of a and Δ_1 the width of its confidence interval;
- check the following:
  - if Δ_1 ≤ 0.1â_1, we stop;
  - if Δ_1 > 0.1â_1, we carry out the main simulation with N_2 = (Δ_1/(0.1â_1))² N_1 replications;
- we obtain new â_2 and Δ_2; they may not be exactly what we wanted (Δ_2 ≤ 0.1â_2) due to randomness;
- if Δ_2 > 0.1â_2, carry out a new attempt.
How did we decide how many replications are needed?

  N_2 = (Δ_1/(0.1â_1))² N_1. (2)

Recall the interval estimate of the mean:

  (Ê[X] − z_{α/2}·ŝ/√N, Ê[X] + z_{α/2}·ŝ/√N). (3)

Assuming ŝ and z_{α/2} are constant (they are not!) for different N, the interval width is proportional to 1/√N, so

  Δ_1·√N_1 = 0.1â_1·√N_2. (4)

Solving for N_2:

  √N_2 = (Δ_1/(0.1â_1))·√N_1  ⟹  N_2 = (Δ_1/(0.1â_1))² N_1. (5)
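Rule (2) is easy to wrap in a small helper; a minimal sketch (the function name and the 10% relative-accuracy default are illustrative, not from the lecture):

```python
import math

def replications_needed(delta1, a_hat1, n1, rel=0.1):
    """N2 = (Delta1 / (rel * a_hat1))^2 * N1, rounded up to an integer."""
    return math.ceil((delta1 / (rel * a_hat1)) ** 2 * n1)

# pilot: N1 = 20 replications gave a_hat1 = 50 with interval width Delta1 = 15
n2 = replications_needed(15.0, 50.0, 20)  # (15 / 5)^2 * 20 = 180
```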
Sequential in-simulation checking. We want to estimate a statistic a within the interval ±0.1â:
- we get N replications;
- we calculate â_1 and Δ_1 from these replications;
- check the following:
  - if Δ_1 ≤ 0.1â_1, we stop;
  - if Δ_1 > 0.1â_1, we carry out N additional replications;
- we calculate new â_2 and Δ_2 based on the total of 2N replications:
  - if Δ_2 ≤ 0.1â_2, we stop;
  - if Δ_2 > 0.1â_2, we carry out N additional replications;
- repeat...

Note: N can be set to 10.
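The sequential procedure above can be sketched as a loop (a minimal sketch; the exponential replication generator, the step size of 10, the z-value 1.96, and the safety cap are illustrative assumptions):

```python
import math
import random

def sequential_mean(run_replication, n_step=10, rel=0.1, z=1.96, max_rep=10_000):
    """Keep adding n_step replications until the CI half-width <= rel * |mean|."""
    obs = []
    while len(obs) < max_rep:
        obs.extend(run_replication() for _ in range(n_step))
        n = len(obs)
        mean = sum(obs) / n
        s2 = sum((x - mean) ** 2 for x in obs) / (n - 1)  # sample variance
        delta = z * math.sqrt(s2 / n)                     # CI half-width
        if delta <= rel * abs(mean):
            return mean, delta, n
    return mean, delta, n  # give up at the cap; accuracy not reached

random.seed(2)
# one "replication" = one exponential observation with true mean 2
mean, delta, n = sequential_mean(lambda: random.expovariate(0.5))
```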
1.2. Method of batch means

Use of pilot experiments. We want to estimate a statistic a within the interval ±0.1â:
- gather k_1 batches;
- estimate â_1 and Δ_1;
- check the following:
  - if Δ_1 ≤ 0.1â_1, we stop;
  - if Δ_1 > 0.1â_1, additional batches have to be simulated for a total of k_2 = (Δ_1/(0.1â_1))² k_1 batches;
- calculate new â_2 and Δ_2 based on the total of k_1 + (k_2 − k_1) = k_2 batches; they may not be exactly what we wanted (Δ_2 ≤ 0.1â_2) due to randomness;
- in the re-run we may either:
  - simulate only the k_2 − k_1 additional batches (preferred);
  - simulate all k_2 batches anew.
Sequential in-simulation checking. We want to estimate a statistic a within the interval ±0.1â:
- gather k batches;
- estimate â_1 and Δ_1;
- check the following:
  - if Δ_1 ≤ 0.1â_1, we stop;
  - if Δ_1 > 0.1â_1, gather k additional batches;
- calculate new â_2 and Δ_2 based on the total of 2k batches;
- check the following:
  - if Δ_2 ≤ 0.1â_2, we stop;
  - if Δ_2 > 0.1â_2, gather k additional batches;
- repeat...

Note: k can be set to …
2. Variance reduction techniques

Suppose we have to estimate the mean given N iid observations:
- the point estimate of the mean is:

  Ê[X] = (1/N) Σ_{i=1}^{N} X_i, (6)

- the confidence interval for the mean is given by:

  (Ê[X] − z_{α/2}·ŝ/√N, Ê[X] + z_{α/2}·ŝ/√N), (7)

  where ŝ² is the estimate of the variance.

How to shorten the confidence interval:
- the accuracy of an estimate can be increased by increasing the number of observations;
  - shortcoming: may require very long simulations;
- the same accuracy can also be achieved by reducing the variance.
General notes:
- techniques that try to reduce the variance are called variance reduction techniques;
- they require additional computational effort;
- it is not known in advance (prior to simulation) whether they actually reduce the variance.

To decide whether a technique helps to reduce the variance, use:
- pilot experiments;
- in-simulation checking.

We consider the following variance reduction techniques:
- antithetic variates technique;
- control variates technique.
3. Antithetic variates technique

General notes:
- a very simple technique;
- requires only a few additions to the program.

Major shortcomings:
- no guarantees of effectiveness;
- no information in advance on how much the variance will be reduced.

Take the following assumptions:
- iid observations (x_1^(1), x_2^(1), ..., x_n^(1)) are obtained in the first simulation;
- iid observations (x_1^(2), x_2^(2), ..., x_n^(2)) are obtained in the second simulation.
The idea of the method:
- define a new random variable Z = (X^(1) + X^(2))/2:

  z_i = (x_i^(1) + x_i^(2))/2,  i = 1, 2, ..., n. (8)

- for the mean of Z we have:

  E[Z] = E[(X^(1) + X^(2))/2] = (1/2)(E[X^(1)] + E[X^(2)]) = E[X]. (9)

- for the variance of Z we have:

  Var[Z] = Var[(X^(1) + X^(2))/2] = (1/4)(Var[X^(1)] + Var[X^(2)] + 2Cov(X^(1), X^(2))). (10)

- recalling that Var[X^(1)] = Var[X^(2)] = Var[X], we have:

  Var[Z] = (1/2)(Var[X] + Cov(X^(1), X^(2))). (11)
- since Cov(X, Y) = ρ·√(Var[X]·Var[Y]), where ρ is the correlation coefficient, and here Var[X^(1)] = Var[X^(2)] = Var[X], we have:

  Var[Z] = (1/2)·Var[X]·(1 + ρ). (12)

- next, use Z (instead of X) to determine the confidence intervals as usual.

How the reduction is achieved:
- observe Var[Z] and see that:

  Var[Z] → Var[X] when ρ → 1,  Var[Z] → 0 when ρ → −1. (13)

- also note the special case:

  Var[Z] = (1/2)·Var[X] when ρ = 0. (14)

How to benefit: construct (x_1^(1), x_2^(1), ..., x_n^(1)) and (x_1^(2), x_2^(2), ..., x_n^(2)) such that ρ < 0.
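A minimal numerical sketch of (12): estimate E[e^U] for U ~ U(0, 1) (true value e − 1) using antithetic pairs (u, 1 − u). The toy integrand is an illustrative assumption, not from the lecture; since e^u and e^{1−u} are negatively correlated (ρ < 0), Var[Z] falls well below Var[X]/2:

```python
import math
import random

random.seed(3)
n = 20_000
u = [random.random() for _ in range(n)]

plain = [math.exp(x) for x in u]                          # X, one sample per draw
anti = [(math.exp(x) + math.exp(1 - x)) / 2 for x in u]   # Z = (X^(1) + X^(2))/2

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

var_plain, var_anti = var(plain), var(anti)
mean_anti = sum(anti) / n   # estimate of E[e^U] = e - 1
```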
Suppose we are interested in the waiting times W in a queuing system:
- we can control the interarrival times Y and the service times Z:
  - (Y small, Z large) → W large; (Y large, Z small) → W small.

How to create negative correlation in this example:
- let F(t) and G(s) be the CDFs of the interarrival and service times, respectively;
- let r_i and v_i be pseudo-random numbers from U(0, 1);
- let t_i = F^{-1}(r_i) and s_i = G^{-1}(v_i) be the interarrival and service times associated with the i-th arrival;
- to determine whether the queue increases or decreases, consider d_i = t_i − s_i:
  - negative: busy period; positive: empty period;
- in the second run, associate numbers r'_i and v'_i with the i-th arrival so that d'_i = t'_i − s'_i has the opposite sign compared to d_i:
  - this can be achieved using r'_i = 1 − r_i and v'_i = 1 − v_i;
- we have negative correlation between the two runs!
How to implement:
- make the first run and get (x_1^(1), x_2^(1), ..., x_n^(1));
- make the second run using r'_i = 1 − r_i and v'_i = 1 − v_i to get (x_1^(2), x_2^(2), ..., x_n^(2));
- construct point and interval estimates using Z = (X^(1) + X^(2))/2.

Example results (the comparison table is partly lost in the transcription): straightforward simulation, sampling every 10th customer, vs. the method of antithetic variates, which gave ±1.76 for n = 300 (in overall …).
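The two-run recipe above can be sketched for a simple single-server queue (a minimal illustration; the Lindley recursion for waiting times, the exponential distributions with λ = 0.8 and μ = 1.0, and n = 5000 customers are assumptions, not fixed by the slides):

```python
import math
import random

def waiting_times(rs, vs, lam=0.8, mu=1.0):
    """Waiting times via the Lindley recursion W_1 = 0, W_{i+1} = max(0, W_i + S_i - T_{i+1})."""
    t = [-math.log(1.0 - r) / lam for r in rs]  # interarrival times t_i = F^-1(r_i)
    s = [-math.log(1.0 - v) / mu for v in vs]   # service times s_i = G^-1(v_i)
    w = [0.0]
    for i in range(len(rs) - 1):
        w.append(max(0.0, w[-1] + s[i] - t[i + 1]))
    return w

random.seed(4)
n = 5000
rs = [random.random() for _ in range(n)]
vs = [random.random() for _ in range(n)]

x1 = waiting_times(rs, vs)                                      # first run
x2 = waiting_times([1 - r for r in rs], [1 - v for v in vs])    # antithetic run
z = [(a + b) / 2 for a, b in zip(x1, x2)]                       # Z = (X^(1) + X^(2))/2

m1, m2 = sum(x1) / n, sum(x2) / n
cov12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2)) / (n - 1)  # should be negative
```

The sample covariance between the two runs comes out negative, which is exactly the condition for the averaged estimator Z to have reduced variance.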
Important note:
- this technique does not always provide better results;
- example: for an M/M/2 queue the results are only slightly better.
4. Control variates technique

Also known as the method of the control variable. Assume the following:
- X: the variable whose mean we have to estimate;
- Y: a variable whose mean E[Y] is known in advance;
- Y is strongly correlated with X.

When Y and X are negatively correlated, define Z = X + Y − E[Y]; we have:

  E[Z] = E[X + Y − E[Y]] = E[X],
  Var[Z] = Var[X] + Var[Y] + 2Cov(X, Y). (15)

- since Y and X are negatively correlated, Cov(X, Y) < 0;
- if Var[Y] + 2Cov(X, Y) < 0, we reduce the variance.
When Y and X are positively correlated, define Z = X − Y + E[Y]; we have:

  E[Z] = E[X − Y + E[Y]] = E[X],
  Var[Z] = Var[X] + Var[Y] − 2Cov(X, Y). (16)

- since Y and X are positively correlated, Cov(X, Y) > 0;
- if Var[Y] − 2Cov(X, Y) < 0, we reduce the variance.

Example: a queuing system with X the waiting times and Y the interarrival times:
- when Y is small, X is large, and vice versa: negative correlation;
- get observations (x_1, x_2, ..., x_n) and (y_1, y_2, ..., y_n) and let:

  z_i = x_i + y_i − E[Y],  i = 1, 2, ..., n. (17)

- construct confidence intervals for E[X] using Ê[Z] ± 1.96·ŝ/√n, where:

  Ê[Z] = (1/n) Σ_{i=1}^{n} z_i,  ŝ²[Z] = (1/(n−1)) Σ_{i=1}^{n} (z_i − Ê[Z])². (18)
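A minimal sketch of the positively correlated case (16): estimate E[e^U], U ~ U(0, 1), with the control Y = U, whose mean E[Y] = 1/2 is known exactly. The toy integrand is an illustrative assumption, not the queuing example from the slides:

```python
import math
import random

random.seed(5)
n = 10_000
u = [random.random() for _ in range(n)]
x = [math.exp(t) for t in u]                   # X = e^U, true mean e - 1
y = u                                          # control Y = U with known E[Y] = 1/2
z = [xi - yi + 0.5 for xi, yi in zip(x, y)]    # Z = X - Y + E[Y]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((t - m) ** 2 for t in xs) / (len(xs) - 1)

var_x, var_z = var(x), var(z)   # Var[Y] - 2 Cov(X, Y) < 0 here, so var_z < var_x
mean_z = sum(z) / n
```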
A general view of the approach. The RV Z can be obtained as follows:

  Z = X − a(Y − E[Y]), (19)

where:
- a is some constant to be estimated;
- X and Y are positively or negatively correlated RVs.

We have E[Z] = E[X], and for Var[Z]:

  Var[Z] = Var[X] + a²·Var[Y] − 2a·Cov(X, Y). (20)

Note that Z has a smaller variance than X if:

  a²·Var[Y] − 2a·Cov(X, Y) < 0. (21)

We have to select a to minimize the RHS of (20); setting the derivative to zero:

  2a·Var[Y] − 2Cov(X, Y) = 0,  a = Cov(X, Y)/Var[Y]. (22)
Substituting into the expression for Var[Z], we have:

  Var[Z] = Var[X] − [Cov(X, Y)]²/Var[Y] = (1 − ρ²_{XY})·Var[X]. (23)

Note: if X and Y are correlated, we always get a reduction of the variance if a is optimal:
- determination of a requires knowledge of Var[Y] and Cov(X, Y);
- pilot experiments can be used to get estimates of Var[Y] and Cov(X, Y).

We can also generalize further to get:

  Z = X − Σ_{i=1}^{m} a_i(Y_i − E[Y_i]), (24)

where a_i, i = 1, 2, ..., m, are real numbers.

Important notes:
- this method may give assurance after a pilot experiment;
- its complexity is a little higher than that of the antithetic variates technique.
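The pilot-experiment recipe for the optimal coefficient (22)-(23) can be sketched on the same toy pair X = e^U, Y = U (illustrative assumptions; for this pair the optimal a = Cov(X, Y)/Var[Y] is about 1.69, and ρ²_{XY} is close to 1, so the reduction in (23) is large):

```python
import math
import random

random.seed(6)

def sample(n):
    """Draw (X, Y) pairs: X = e^U, Y = U with known E[Y] = 1/2."""
    u = [random.random() for _ in range(n)]
    return [math.exp(t) for t in u], u

# pilot run: estimate a = Cov(X, Y) / Var[Y]
xp, yp = sample(2_000)
mx, my = sum(xp) / len(xp), sum(yp) / len(yp)
cov = sum((a - mx) * (b - my) for a, b in zip(xp, yp)) / (len(xp) - 1)
vy = sum((b - my) ** 2 for b in yp) / (len(yp) - 1)
a_hat = cov / vy

# main run with the estimated coefficient: Z = X - a(Y - E[Y])
x, y = sample(20_000)
z = [xi - a_hat * (yi - 0.5) for xi, yi in zip(x, y)]

def var(xs):
    m = sum(xs) / len(xs)
    return sum((t - m) ** 2 for t in xs) / (len(xs) - 1)

var_x, var_z = var(x), var(z)   # var_z is roughly (1 - rho^2) * var_x
```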
5. Validation of the simulation model

General notes:
- how close are we to reality?
- how confident are we that the simulation results are accurate?
- a very important step, yet very often simply skipped.

Example: developing a new switch:
- we have to know its performance in advance;
- the only way: model the system either analytically or using simulation studies;
- there is almost no way to perform validation.

Example: extending a switch:
- we have to know its performance in advance;
- we can use the following approach:
  - simulate the real system and compare the results;
  - extend the simulation model and simulate.
The following checks can be used:
- check the pseudo-random number generators:
  - independence, uniformity, etc.;
- check the generators of arbitrary random numbers:
  - independence or dependence tests, fitting criteria (e.g. χ²), etc.;
- check the logic of the simulation program:
  - print out small samples of variables, the event list, and data structures for further hand checking;
- check the relationship validity:
  - print out all variables for small parts of a run and check whether they are correctly updated;
- check the output validity:
  - possible only if empirical data are available.

Some important notes:
- these checks must be performed if you get something you did not expect;
- but: do not try to get what you expect!!!
Lecturer: Dmitri A. Moltchanov
E-mail: moltchan@cs.tut.fi
http://www.cs.tut.fi/~moltchan/modsim/
http://www.cs.tut.fi/kurssit/tlt-2706/
More informationStat 150 Practice Final Spring 2015
Stat 50 Practice Final Spring 205 Instructor: Allan Sly Name: SID: There are 8 questions. Attempt all questions and show your working - solutions without explanation will not receive full credit. Answer
More informationConfidence Intervals for the Sample Mean
Confidence Intervals for the Sample Mean As we saw before, parameter estimators are themselves random variables. If we are going to make decisions based on these uncertain estimators, we would benefit
More informationUNIVERSITY OF YORK. MSc Examinations 2004 MATHEMATICS Networks. Time Allowed: 3 hours.
UNIVERSITY OF YORK MSc Examinations 2004 MATHEMATICS Networks Time Allowed: 3 hours. Answer 4 questions. Standard calculators will be provided but should be unnecessary. 1 Turn over 2 continued on next
More informationState-dependent and Energy-aware Control of Server Farm
State-dependent and Energy-aware Control of Server Farm Esa Hyytiä, Rhonda Righter and Samuli Aalto Aalto University, Finland UC Berkeley, USA First European Conference on Queueing Theory ECQT 2014 First
More informationQueuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem. Wade Trappe
Queuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem Wade Trappe Lecture Overview Network of Queues Introduction Queues in Tandem roduct Form Solutions Burke s Theorem What
More informationVariance reduction. Timo Tiihonen
Variance reduction Timo Tiihonen 2014 Variance reduction techniques The most efficient way to improve the accuracy and confidence of simulation is to try to reduce the variance of simulation results. We
More informationBulk input queue M [X] /M/1 Bulk service queue M/M [Y] /1 Erlangian queue M/E k /1
Advanced Markovian queues Bulk input queue M [X] /M/ Bulk service queue M/M [Y] / Erlangian queue M/E k / Bulk input queue M [X] /M/ Batch arrival, Poisson process, arrival rate λ number of customers in
More informationReview: General Approach to Hypothesis Testing. 1. Define the research question and formulate the appropriate null and alternative hypotheses.
1 Review: Let X 1, X,..., X n denote n independent random variables sampled from some distribution might not be normal!) with mean µ) and standard deviation σ). Then X µ σ n In other words, X is approximately
More informationVectors Part 1: Two Dimensions
Vectors Part 1: Two Dimensions Last modified: 20/02/2018 Links Scalars Vectors Definition Notation Polar Form Compass Directions Basic Vector Maths Multiply a Vector by a Scalar Unit Vectors Example Vectors
More informationIEOR 6711, HMWK 5, Professor Sigman
IEOR 6711, HMWK 5, Professor Sigman 1. Semi-Markov processes: Consider an irreducible positive recurrent discrete-time Markov chain {X n } with transition matrix P (P i,j ), i, j S, and finite state space.
More informationUC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 9 Fall 2007
UC Berkeley Department of Electrical Engineering and Computer Science EE 26: Probablity and Random Processes Problem Set 9 Fall 2007 Issued: Thursday, November, 2007 Due: Friday, November 9, 2007 Reading:
More informationEECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.
EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)
More informationChapter 11 Output Analysis for a Single Model. Banks, Carson, Nelson & Nicol Discrete-Event System Simulation
Chapter 11 Output Analysis for a Single Model Banks, Carson, Nelson & Nicol Discrete-Event System Simulation Purpose Objective: Estimate system performance via simulation If q is the system performance,
More informationLecturer: Olga Galinina
Lecturer: Olga Galinina E-mail: olga.galinina@tut.fi Outline Motivation Modulated models; Continuous Markov models Markov modulated models; Batch Markovian arrival process; Markovian arrival process; Markov
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationSTA510. Solutions to exercises. 3, fall 2012
Solutions-3, 3. oktober 212; s. 1 STA51. Solutions to eercises. 3, fall 212 See R script le 'problem-set-3-h-12.r'. Problem 1 Histogram of y.non.risk The histograms to the right show the simulated distributions
More informationHITTING TIME IN AN ERLANG LOSS SYSTEM
Probability in the Engineering and Informational Sciences, 16, 2002, 167 184+ Printed in the U+S+A+ HITTING TIME IN AN ERLANG LOSS SYSTEM SHELDON M. ROSS Department of Industrial Engineering and Operations
More informationHomework 9 (due November 24, 2009)
Homework 9 (due November 4, 9) Problem. The join probability density function of X and Y is given by: ( f(x, y) = c x + xy ) < x
More informationARD: AN AUTOMATED REPLICATION-DELETION METHOD FOR SIMULATION ANALYSIS
ARD: AN AUTOMATED REPLICATION-DELETION METHOD FOR SIMULATION ANALYSIS Emily K. Lada Anup C. Mokashi SAS Institute Inc. 100 SAS Campus Drive, R5413 Cary, NC 27513-8617, USA James R. Wilson North Carolina
More informationSTA442/2101: Assignment 5
STA442/2101: Assignment 5 Craig Burkett Quiz on: Oct 23 rd, 2015 The questions are practice for the quiz next week, and are not to be handed in. I would like you to bring in all of the code you used to
More informationIEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008
IEOR 6711: Stochastic Models I SOLUTIONS to the First Midterm Exam, October 7, 2008 Justify your answers; show your work. 1. A sequence of Events. (10 points) Let {B n : n 1} be a sequence of events in
More information