2WB05 Simulation Lecture 7: Output analysis
Marko Boon, December 17, 2012
Outline

- Output analysis of a simulation
- Confidence intervals
- Warm-up interval
- Common random numbers
Output analysis of a simulation: confidence intervals

Let X_1, X_2, ..., X_n be independent realizations of a random variable X with unknown mean µ and unknown variance σ².

Sample mean: \bar{X}(n) = \frac{1}{n} \sum_{i=1}^{n} X_i

Sample variance: S^2(n) = \frac{1}{n-1} \sum_{i=1}^{n} \left( X_i - \bar{X}(n) \right)^2

Clearly \bar{X}(n) is an estimator for the unknown mean µ. How can we construct a confidence interval for µ?
Confidence intervals

The central limit theorem states that for large n,

\frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma \sqrt{n}}

is approximately a standard normal random variable, and this remains valid if σ is replaced by S(n). Hence, let z_\beta = \Phi^{-1}(\beta) (e.g., z_{0.975} = 1.96); then

P\left( -z_{1-\alpha/2} \le \frac{\sum_{i=1}^{n} X_i - n\mu}{S(n)\sqrt{n}} \le z_{1-\alpha/2} \right) \approx 1 - \alpha

or equivalently

P\left( \bar{X}(n) - z_{1-\alpha/2} \frac{S(n)}{\sqrt{n}} \le \mu \le \bar{X}(n) + z_{1-\alpha/2} \frac{S(n)}{\sqrt{n}} \right) \approx 1 - \alpha
Confidence intervals

Conclusion: an approximate 100(1−α)% confidence interval for the unknown mean µ is given by

\bar{X}(n) \pm z_{1-\alpha/2} \frac{S(n)}{\sqrt{n}}

As a consequence, since the half-width shrinks like 1/\sqrt{n}, obtaining one extra digit of accuracy in the estimate of µ increases the required simulation time by approximately a factor of 100.
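As a quick illustration (a Python sketch, not part of the original slides), the interval \bar{X}(n) \pm z_{1-\alpha/2} S(n)/\sqrt{n} can be computed directly from a sample; z = 1.96 gives an approximate 95% interval.

```python
import math
import random

def confidence_interval(xs, z=1.96):
    """Approximate 95% CI for the mean: mean(xs) +/- z * S(n)/sqrt(n)."""
    n = len(xs)
    mean = sum(xs) / n
    s2 = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance S^2(n)
    half = z * math.sqrt(s2 / n)
    return mean - half, mean + half

# Estimate the mean of a uniform(-1, 1) random variable (true mean 0).
random.seed(42)
xs = [random.uniform(-1, 1) for _ in range(10000)]
lo, hi = confidence_interval(xs)
```

With 10,000 observations the half-width is roughly 1.96 · \sqrt{1/3}/100 ≈ 0.011, so the interval should comfortably contain 0.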
Confidence intervals

[Figure: 100 confidence intervals for the mean of a uniform random variable on (−1, 1); each interval is based on 100 observations.]
Confidence intervals

Remark: if the observations X_i are normally distributed, then

\frac{\sum_{i=1}^{n} X_i - n\mu}{S(n)\sqrt{n}}

has, for all n, a Student's t distribution with n−1 degrees of freedom; so an exact confidence interval can be obtained by replacing z_{1-\alpha/2} by the corresponding quantile of the t distribution with n−1 degrees of freedom.
Confidence intervals

Remark: the sample mean and variance of the realizations X_1, ..., X_n of a random variable X can be computed recursively. With \bar{X}(1) = X_1 and S^2(1) = 0, for n = 2, 3, ...:

\bar{X}(n) = \frac{n-1}{n} \bar{X}(n-1) + \frac{1}{n} X_n

S^2(n) = \frac{n-2}{n-1} S^2(n-1) + \frac{1}{n} \left( X_n - \bar{X}(n-1) \right)^2
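The recursion can be checked with a short Python sketch (not from the slides); after processing all observations, the recursive values must agree with the direct formulas for the sample mean and variance.

```python
def update(n, mean_prev, s2_prev, x_n):
    """One recursion step (valid for n >= 2): X-bar(n) and S^2(n) from step n-1."""
    mean = (n - 1) / n * mean_prev + x_n / n
    s2 = (n - 2) / (n - 1) * s2_prev + (x_n - mean_prev) ** 2 / n
    return mean, s2

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean, s2 = data[0], 0.0  # X-bar(1) = X_1, S^2(1) = 0
for n, x in enumerate(data[1:], start=2):
    mean, s2 = update(n, mean, s2, x)

# Direct (non-recursive) computation for comparison:
direct_mean = sum(data) / len(data)
direct_s2 = sum((x - direct_mean) ** 2 for x in data) / (len(data) - 1)
```

The recursion needs only constant memory per step, which is why it is convenient inside a long simulation run.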
Warm-up interval

The problem of the initialization effect: we are interested in the long-term behaviour of the system, but the choice of the initial state of the simulation may influence the quality of our estimate. One way of dealing with this problem is to choose the run length N very large and neglect the initialization effect. A better way, however, is to throw away the first k observations of each run.
Warm-up interval

Let W_1, W_2, ..., W_N be realizations of waiting times in a single run, and suppose we want to estimate the steady-state mean waiting time E(W), defined as

E(W) = \lim_{j \to \infty} E(W_j),

by the sample mean

\bar{W}_N = \frac{1}{N} \sum_{j=1}^{N} W_j
Warm-up interval

In this estimate there are two types of errors:

- Systematic error, or bias: E(\bar{W}_N) ≠ E(W), due to the influence of the initial conditions, which may not be representative of steady-state behavior;
- Sampling (or random) error: the estimator \bar{W}_N is of course a random variable.
Warm-up interval

To reduce the systematic error, we delete the initial observations, say W_1, ..., W_k, and use the remaining observations W_{k+1}, ..., W_N to estimate E(W) by the truncated sample mean

\bar{W}_{k,N} = \frac{1}{N-k} \sum_{j=k+1}^{N} W_j

One then expects \bar{W}_{k,N} to be less biased than \bar{W}_N, since the observations near the beginning of the simulation may not be representative of steady-state behavior; the parameter k is called the warm-up interval.
Warm-up interval

Choosing the warm-up interval: we would like to pick k such that E(\bar{W}_{k,N}) ≈ E(W).

- If k is too small, then E(\bar{W}_{k,N}) may be significantly different from E(W);
- If k is too large, then the variance of \bar{W}_{k,N} (the sampling error) may be too large, since its variance is proportional to 1/(N−k).
Warm-up interval

Graphical procedure: our goal is to determine a value k such that E(W_j) ≈ E(W) for all j > k. The variability of the process W_1, W_2, ... makes it hard to determine k from a single run. Therefore, the idea is to make n independent replications (by using different random numbers) and to employ the following steps:
Warm-up interval

1. Make n independent replications (or runs), each of length N; let W_j^{(i)} denote the j-th waiting time in run i.
2. Let

\bar{W}_j = \frac{1}{n} \sum_{i=1}^{n} W_j^{(i)}

The averaged process \bar{W}_1, \bar{W}_2, ... has means and variances

E(\bar{W}_j) = E(W_j), \qquad var(\bar{W}_j) = var(W_j)/n.

So its mean behavior is the same as that of the original process, but it has a smaller (1/n-th) variance.
3. Plot \bar{W}_j and choose k such that beyond k the process \bar{W}_1, \bar{W}_2, ... appears to have converged.
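The three steps above can be sketched as follows (Python; generating M/M/1 waiting times via the Lindley recursion is an assumption of this sketch, not something stated on the slides):

```python
import random

def waiting_times(N, lam=0.5, mu=1.0, rng=None):
    """One run of M/M/1 waiting times via the Lindley recursion,
    W_{j+1} = max(0, W_j + S_j - A_{j+1}), started empty (W_1 = 0)."""
    rng = rng or random.Random()
    w, ws = 0.0, []
    for _ in range(N):
        ws.append(w)
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
    return ws

# Steps 1-2: n independent replications, then average across runs per index j.
n, N = 20, 500
runs = [waiting_times(N, rng=random.Random(seed)) for seed in range(n)]
averaged = [sum(run[j] for run in runs) / n for j in range(N)]
# Step 3: plot `averaged` and choose k where the curve appears to have converged.
```

For λ = 0.5, µ = 1 the steady-state mean waiting time is ρ/(µ−λ) = 1, so the averaged curve should rise from 0 toward a level near 1.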
Warm-up interval

[Figure: waiting times for the M/M/1 queue with λ = 0.5 and µ = 1.]
Warm-up interval

[Figures (three slides): averaged waiting times for the M/M/1 queue with λ = 0.5 and µ = 1, for various numbers of replications.]
Warm-up interval

To smooth high-frequency oscillations in \bar{W}_1, \bar{W}_2, ... (but leave the trend) one may consider the moving average \bar{W}_j(w), where w is the window size, defined as

\bar{W}_j(w) = \frac{1}{2w+1} \sum_{i=-w}^{w} \bar{W}_{j+i} \quad \text{for } j = w+1, w+2, \ldots, N-w,

and

\bar{W}_j(w) = \frac{1}{2j-1} \sum_{i=-(j-1)}^{j-1} \bar{W}_{j+i} \quad \text{for } j = 1, \ldots, w.

The warm-up interval k can then be determined from the plot of \bar{W}_j(w) for j = 1, ..., N−w.
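A possible Python implementation of this two-case moving average (a sketch, not from the slides):

```python
def moving_average(ws, w):
    """Moving average with window size w; for j <= w the window shrinks
    symmetrically (2j-1 terms), otherwise it has the full 2w+1 terms."""
    N = len(ws)
    out = []
    for j in range(1, N - w + 1):    # 1-based index j, as on the slide
        half = min(w, j - 1)         # shrunken near the start
        window = ws[j - 1 - half : j + half]
        out.append(sum(window) / len(window))
    return out
```

For example, `moving_average([1, 2, 3, 4, 5, 6, 7], 2)` averages 1, 3, 5, 5, 5 terms for j = 1, ..., 5.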
Warm-up interval

[Figures (three slides): moving averages of the averaged waiting times for the M/M/1 queue with λ = 0.5 and µ = 1; the number of replications is 10, 10 and 100, respectively, for various window sizes.]
Warm-up interval

Batch means: instead of doing n independent runs, we try to obtain n independent observations by making a single long run and, after deleting the first k observations, dividing this run into n subruns. The advantage is that we have to go through the warm-up period only once.
Batch means

Let W_1, W_2, ..., W_{nN} be the output of a single run, where we have already deleted the first k observations and renumbered the remaining ones; hence W_1, W_2, ..., W_{nN} will be representative of the steady state. We divide the observations into n batches of length N. Thus batch 1 consists of W_1, W_2, ..., W_N; batch 2 of W_{N+1}, W_{N+2}, ..., W_{2N}; and so on. Let \bar{W}_N^{(i)} be the sample (or batch) mean of the N observations in batch i, so

\bar{W}_N^{(i)} = \frac{1}{N} \sum_{j=(i-1)N+1}^{iN} W_j
Batch means

The \bar{W}_N^{(i)}'s play the same role as the run averages in the independent replication method. Unfortunately, the \bar{W}_N^{(i)}'s will now be dependent. But, under mild conditions, for large N the \bar{W}_N^{(i)}'s will be approximately independent, each with the same mean E(W). Hence, for N large enough, it is reasonable to treat the \bar{W}_N^{(i)}'s as i.i.d. random variables with mean E(W); thus

\bar{W}_{n,N} \pm z_{1-\delta/2} \frac{S_{n,N}}{\sqrt{n}}

again provides a 100(1−δ)% confidence interval for E(W), with \bar{W}_{n,N} and S_{n,N}^2 again the sample mean and sample variance of the realizations \bar{W}_N^{(1)}, ..., \bar{W}_N^{(n)}.
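In Python, the batch-means interval might look like this (a sketch assuming the warm-up observations have already been removed and the run length is a multiple of the number of batches):

```python
import math

def batch_means_ci(ws, n_batches, z=1.96):
    """CI for the steady-state mean from one long run (warm-up already deleted),
    cut into n_batches batches of equal length N."""
    N = len(ws) // n_batches
    means = [sum(ws[i * N:(i + 1) * N]) / N for i in range(n_batches)]  # batch means
    grand = sum(means) / n_batches                                      # W-bar_{n,N}
    s2 = sum((b - grand) ** 2 for b in means) / (n_batches - 1)         # S^2_{n,N}
    half = z * math.sqrt(s2 / n_batches)
    return grand - half, grand + half

# Toy run whose five batch means are 1, 2, 3, 4, 5:
ws = [float(k) for k in range(1, 6) for _ in range(10)]
lo, hi = batch_means_ci(ws, 5)
```

Here the grand mean is 3 and the batch-mean sample variance is 2.5, so the half-width is 1.96·\sqrt{2.5/5}.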
Common random numbers

If we compare the performance of two systems with random components, it is in general better to evaluate both systems with the same realizations of the random components. If X and Y are estimators for the performance of the two systems, then

var(X − Y) = var(X) + var(Y) − 2 cov(X, Y).

In general, the use of common random numbers leads to positively correlated X and Y:

cov(X, Y) > 0.

Hence the variance of X − Y (and thus the corresponding confidence interval) will be smaller than in the case where X and Y are independent (i.e., generated with independent random numbers).
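A toy Python experiment (not from the slides; the two "systems" f and g are hypothetical stand-ins) shows the effect: driving both systems with the same uniform random numbers makes cov(X, Y) > 0 and shrinks var(X − Y).

```python
import random

f = lambda u: u ** 2   # performance of toy system A, driven by one uniform
g = lambda u: u        # performance of toy system B

rng = random.Random(7)
diffs_crn, diffs_ind = [], []
for _ in range(100000):
    u = rng.random()
    diffs_crn.append(f(u) - g(u))                        # common random numbers
    diffs_ind.append(f(rng.random()) - g(rng.random()))  # independent numbers

def sample_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
```

Analytically, var(f(U) − g(U)) = 1/180 ≈ 0.006 with common numbers, versus var(f(U1)) + var(g(U2)) = 31/180 ≈ 0.17 with independent ones, a roughly 30-fold reduction.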
Example: job scheduling

Suppose that N jobs have to be processed on M identical machines. The processing times are independent and exponentially distributed with mean 1. We want to compare the completion time of the last job, C_max, under two different strategies:

- Longest Processing Time First (LPTF)
- Shortest Processing Time First (SPTF)

Remark: let X_1, ..., X_N be independent exponentials with mean 1; denote the smallest one by X_{(1)}, the second smallest by X_{(2)}, and so on. Then for i = 1, ..., N,

X_{(i)} \overset{d}{=} Y_N + \cdots + Y_{N-i+1},

where Y_1, ..., Y_N are independent exponentials with E(Y_i) = 1/i (Y_i is the minimum of i exponentials with mean 1).
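The distributional identity in the remark can be checked empirically with a small Python sketch (not from the slides): generating the sorted processing times as cumulative sums of Y_N, ..., Y_1 must reproduce the mean of each order statistic, E(X_{(i)}) = \sum_{k=N-i+1}^{N} 1/k; in particular E(X_{(N)}) is the harmonic number H_N.

```python
import random

def order_stats_via_sums(N, rng):
    """X_(i) = Y_N + ... + Y_{N-i+1}, with Y_k exponential of rate k."""
    xs, s = [], 0.0
    for k in range(N, 0, -1):
        s += rng.expovariate(k)   # Y_k, mean 1/k
        xs.append(s)
    return xs                     # already sorted: X_(1) <= ... <= X_(N)

rng = random.Random(3)
N, reps = 10, 20000
mean_max = sum(order_stats_via_sums(N, rng)[-1] for _ in range(reps)) / reps
harmonic = sum(1.0 / k for k in range(1, N + 1))   # E(X_(N)) = H_N
```

This representation is what makes "common jobs" easy: both strategies can be fed exactly the same sorted processing times.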
Example: job scheduling

[Table: results for M = 2, N = 10 for the SPTF and LPTF strategies, without and with common jobs; for each case the table lists n (the number of experiments), the mean, the standard deviation and the half width of the 95% confidence interval of C_max^{SPTF} − C_max^{LPTF} (numerical values lost in transcription).]

Hence, using common jobs reduces the (sample) standard deviation of C_max^{SPTF} − C_max^{LPTF}, and thus the width of the confidence interval for its mean, by a factor of 5! Conclusion: common random numbers is better!
Example: production system

We want to determine the reduction in the mean waiting time when we have an extra machine. To compare the two situations we want to use the same realizations of arrival times and processing times in the simulation. Assume that processing times of jobs are generated when they enter production. Then the order in which arrival and processing times are generated depends on the number of machines: a synchronization problem.

Solution:
- Use separate random number streams for the different random variables (arrival and processing times);
- Design the simulation model such that it guarantees that exactly the same realizations of the random variables are generated.

In this problem, the second approach can be realized by assigning each job a processing time immediately upon arrival.
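A minimal Python sketch of both ideas (the stream separation and the per-job sampling come from the slide; the concrete function name, seeds and parameters are this sketch's assumptions):

```python
import random

arrival_rng = random.Random(1)   # dedicated stream for interarrival times
service_rng = random.Random(2)   # dedicated stream for processing times

def generate_jobs(n, lam=4.0, mu=1.0):
    """Assign each job its processing time immediately upon arrival, so the
    same job list can be fed to both the 5- and the 6-machine model."""
    t, jobs = 0.0, []
    for _ in range(n):
        t += arrival_rng.expovariate(lam)
        jobs.append((t, service_rng.expovariate(mu)))  # (arrival, processing)
    return jobs

jobs = generate_jobs(100)
```

Because the job list is fixed before either model runs, the number of machines can no longer change which random numbers are consumed where.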
Example: production system

[Table: results for λ = 4, µ = 1 and M = 5 resp. M = 6 machines; in each run N = 10^5 waiting times are generated, and for each run i the table lists the mean waiting times \bar{W}_N^{(i)} for M = 5 and M = 6 and their difference Δ^{(i)} (numerical values lost in transcription).]

The standard deviation of the Δ^{(i)} is equal to ... If both systems use independent realizations of arrival and processing times, the standard deviation of Δ^{(i)} is 0.029; so common random numbers yields a reduction of 25%.

Merry Christmas!
Unit 5a: Comparisons via Simulation Kwok Tsui (and Seonghee Kim) School of Industrial and Systems Engineering Georgia Institute of Technology Motivation Simulations are typically run to compare 2 or more
More informationSTAT 135 Lab 6 Duality of Hypothesis Testing and Confidence Intervals, GLRT, Pearson χ 2 Tests and Q-Q plots. March 8, 2015
STAT 135 Lab 6 Duality of Hypothesis Testing and Confidence Intervals, GLRT, Pearson χ 2 Tests and Q-Q plots March 8, 2015 The duality between CI and hypothesis testing The duality between CI and hypothesis
More informationi=1 X i/n i=1 (X i X) 2 /(n 1). Find the constant c so that the statistic c(x X n+1 )/S has a t-distribution. If n = 8, determine k such that
Math 47 Homework Assignment 4 Problem 411 Let X 1, X,, X n, X n+1 be a random sample of size n + 1, n > 1, from a distribution that is N(µ, σ ) Let X = n i=1 X i/n and S = n i=1 (X i X) /(n 1) Find the
More informationExact Statistics of a Markov Chain trhough Reduction in Number of States: Marco Aurelio Alzate Monroy Universidad Distrital Francisco José de Caldas
Exact Statistics of a Markov Chain trhough Reduction in Number of States: A Satellite On-board Switching Example Marco Aurelio Alzate Monroy Universidad Distrital Francisco José de Caldas The general setting
More informationMidterm Exam 1 Solution
EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:
More informationMonte Carlo Methods for Stochastic Programming
IE 495 Lecture 16 Monte Carlo Methods for Stochastic Programming Prof. Jeff Linderoth March 17, 2003 March 17, 2003 Stochastic Programming Lecture 16 Slide 1 Outline Review Jensen's Inequality Edmundson-Madansky
More informationKendall notation. PASTA theorem Basics of M/M/1 queue
Elementary queueing system Kendall notation Little s Law PASTA theorem Basics of M/M/1 queue 1 History of queueing theory An old research area Started in 1909, by Agner Erlang (to model the Copenhagen
More informationST 371 (IX): Theories of Sampling Distributions
ST 371 (IX): Theories of Sampling Distributions 1 Sample, Population, Parameter and Statistic The major use of inferential statistics is to use information from a sample to infer characteristics about
More informationComparison of Two Samples
2 Comparison of Two Samples 2.1 Introduction Problems of comparing two samples arise frequently in medicine, sociology, agriculture, engineering, and marketing. The data may have been generated by observation
More informationEE126: Probability and Random Processes
EE126: Probability and Random Processes Lecture 19: Poisson Process Abhay Parekh UC Berkeley March 31, 2011 1 1 Logistics 2 Review 3 Poisson Processes 2 Logistics 3 Poisson Process A continuous version
More informationEC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix)
1 EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) Taisuke Otsu London School of Economics Summer 2018 A.1. Summation operator (Wooldridge, App. A.1) 2 3 Summation operator For
More informationExercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov
Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury
More informationMassachusetts Institute of Technology
6.04/6.43: Probabilistic Systems Analysis (Fall 00) Problem Set 7: Solutions. (a) The event of the ith success occuring before the jth failure is equivalent to the ith success occurring within the first
More informationUnbiased Estimation. Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others.
Unbiased Estimation Binomial problem shows general phenomenon. An estimator can be good for some values of θ and bad for others. To compare ˆθ and θ, two estimators of θ: Say ˆθ is better than θ if it
More informationAN UNDERGRADUATE LECTURE ON THE CENTRAL LIMIT THEOREM
AN UNDERGRADUATE LECTURE ON THE CENTRAL LIMIT THEOREM N.V. KRYLOV In the first section we explain why the central limit theorem for the binomial 1/2 distributions is natural. The second section contains
More informationGlossary availability cellular manufacturing closed queueing network coefficient of variation (CV) conditional probability CONWIP
Glossary availability The long-run average fraction of time that the processor is available for processing jobs, denoted by a (p. 113). cellular manufacturing The concept of organizing the factory into
More informationChapter 5. Chapter 5 sections
1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More informationLecture 3. G. Cowan. Lecture 3 page 1. Lectures on Statistical Data Analysis
Lecture 3 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,
More informationREAL-TIME DELAY ESTIMATION BASED ON DELAY HISTORY SUPPLEMENTARY MATERIAL
REAL-TIME DELAY ESTIMATION BASED ON DELAY HISTORY SUPPLEMENTARY MATERIAL by Rouba Ibrahim and Ward Whitt IEOR Department Columbia University {rei2101, ww2040}@columbia.edu Abstract Motivated by interest
More information1 of 7 7/16/2009 6:12 AM Virtual Laboratories > 7. Point Estimation > 1 2 3 4 5 6 1. Estimators The Basic Statistical Model As usual, our starting point is a random experiment with an underlying sample
More information