Basis for simulation techniques


M. Veeraraghavan, March 7, 2004

Estimation is based on a collection of experimental outcomes, x_1, x_2, ..., x_n, where each experimental outcome x_i is a value of a random variable X_i.

1. Definitions ([3], page 47)

Random sample: The set of random variables X_1, X_2, ..., X_n is said to constitute a random sample of size n from the population with the distribution function F(x) provided they are mutually independent and identically distributed with the distribution function F_{X_i}(x) = F(x) for all i and x.

Statistic: Any function T(X_1, X_2, ..., X_n) of the observations X_1, X_2, ..., X_n is a statistic.

Estimator: Any statistic Θ̂ = Θ̂(X_1, X_2, ..., X_n) used to estimate the value of a parameter θ of the population is called an estimator of θ. An observed value θ̂ = θ̂(x_1, x_2, ..., x_n) is known as an estimate of θ.

2. Estimation methods

2.1 Method of moments - point estimate

Suppose one or more parameters of the distribution of X are to be estimated based on a random sample of size n. Define the k-th sample moment of X to be:

    M_k' = (1/n) Σ_{i=1}^{n} X_i^k,    k = 1, 2, ...    (1)

The k-th population moment is:

    μ_k' = E[X^k],    k = 1, 2, ...,    (2)

which is a function of the unknown parameters.
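To make the moment definitions concrete, here is a minimal method-of-moments sketch in Python. The Exponential(λ) model, the sample size, and the seed are hypothetical choices for illustration only: for an Exponential(λ) population, μ_1' = E[X] = 1/λ, so equating the first population moment to the first sample moment and solving gives λ̂ = 1/M_1'.

```python
import random

# Method-of-moments sketch (hypothetical example): for an Exponential(lam)
# population the first population moment is mu_1' = E[X] = 1/lam, so
# equating it to the first sample moment M_1' gives lam_hat = 1 / M_1'.

def sample_moment(xs, k):
    """k-th sample moment: M_k' = (1/n) * sum of x_i^k."""
    return sum(x ** k for x in xs) / len(xs)

random.seed(42)                      # arbitrary seed, for repeatability
true_lam = 2.0                       # hypothetical "unknown" parameter
xs = [random.expovariate(true_lam) for _ in range(100000)]

m1 = sample_moment(xs, 1)
lam_hat = 1.0 / m1                   # solve E[X] = 1/lam for lam
```

The same recipe extends to families with several parameters: compute as many sample moments as there are unknowns and solve the resulting simultaneous equations.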

The method of moments consists of equating the first few population moments with the corresponding sample moments. Use as many equations as there are unknowns and solve the simultaneous equations.

2.2 Confidence intervals ([3], page 484) - interval estimate

We describe four ways of obtaining confidence intervals in the following subsections.

2.2.1 Chebyshev's inequality

    P(Θ̂ − ε < θ < Θ̂ + ε) ≥ 1 − Var[Θ̂]/ε²    (3)

where Θ̂ is the estimator of the parameter θ. This can be used for any estimator.

As an example, apply this inequality to estimate the population mean θ = μ, using the sample mean as the estimator, Θ̂ = X̄. Then, assuming the population variance is σ², so that Var[X̄] = σ²/n, the inequality becomes:

    P(X̄ − ε < θ < X̄ + ε) ≥ 1 − Var[X̄]/ε²    (4)

    P(X̄ − ε < θ < X̄ + ε) ≥ 1 − σ²/(nε²)    (5)

See Section 3.2 for why Var[X̄] = σ²/n.

2.2.2 Underlying random variable has a normal distribution + population variance is known

In general, confidence intervals obtained by Chebyshev's inequality can be improved if the distribution of X is known. Here are the steps for obtaining a confidence interval for parameter θ:

1. Find a random variable that is a function of X_1, X_2, ..., X_n:

    T = T(X_1, X_2, ..., X_n; θ)    (6)

such that the distribution of T is known.

2. Find numbers a and b such that:

    P(a < T < b) = γ.    (7)

3. After sampling the values x_i of X_i, determine the range of values that θ can take on while maintaining the condition

    a < t(θ) < b,    (8)

where

    t(θ) = T(x_1, x_2, ..., x_n; θ).    (9)

This range is a 100γ% confidence interval of θ.

Thus, if X is known to have a normal distribution N(μ, σ²), then the sample mean X̄ is N(μ, σ²/n) and Z = (X̄ − μ)/(σ/√n) has the standard normal distribution N(0, 1) (see Theorem 3.6 below). Therefore,

    P(a < Z < b) = γ    (10)

gives the 100γ% CI (note Z is the T function of (6)).

    a < Z < b,  or  a < (x̄ − μ)/(σ/√n) < b,  or  x̄ − bσ/√n < μ < x̄ − aσ/√n    (11)

By choosing a = −z and b = z, where z is the number of standard deviations from the mean one must go in order to contain 100γ% of the probability mass, we get the equation in how-to-simulate.doc:

    M − zσ_M ≤ μ ≤ M + zσ_M    (12)

where σ_M = σ/√n and z is the number of standard deviations from the mean one must go in a normal distribution N(0, 1) to contain 100γ% of the probability mass. In other words, P(M − zσ_M ≤ μ ≤ M + zσ_M) = γ. So if γ = 0.95, z = 1.96. In other words, we are 95% sure that the parameter μ, which is being estimated, lies in that range.

Theorem 3.6 ([3], page 70): Let X_1, X_2, ..., X_n be mutually independent random variables such that X_i ~ N(μ_i, σ_i²), i = 1, 2, ..., n. Then S = Σ_{i=1}^{n} X_i is normally distributed, that is, S ~ N(μ, σ²), where

    μ = Σ_{i=1}^{n} μ_i,    σ² = Σ_{i=1}^{n} σ_i².    (13)

Example 3.30 [3]: Sample mean X̄ = S/n. It can be shown that, since S has the distribution S ~ N(nμ, nσ²) and f_X̄(x) = n f_S(nx), X̄ has the distribution N(μ, σ²/n), and the random variable (X̄ − μ)/(σ/√n) has the distribution N(0, 1).

To prove the statement in the above example, use the results from Section 3.1, where X̄ is a random variable that is a function of another random variable S, whose pdf is known (by Theorem 3.6 above). Since X̄ = S/n, we can write f_X̄(x) = n f_S(nx) using the results of Section 3.1. We know that S ~ N(nμ, nσ²). Therefore, we have:

    f_S(x) = (1/(√(2πn) σ)) e^{−(x − nμ)²/(2nσ²)}    (14)

    f_X̄(x) = n f_S(nx) = (1/(√(2π) σ/√n)) e^{−(x − μ)²/(2σ²/n)}.    (15)

Therefore, X̄ ~ N(μ, σ²/n).

In Section 3.2, we showed that Var[X̄] being equal to σ²/n does not require the X_i's to be normally distributed. Here, when the X_i's are normally distributed, we again see that the variance of the r.v. X̄ is σ²/n.

2.2.3 Underlying random variable has an unknown distribution but population variance is known

Even without the X_i's being normally distributed, because of the Central Limit Theorem, the sample mean, as a function of n random variables, has a normal distribution as n → ∞.

Central Limit Theorem ([3], page 7):

Let,,, be mutually idepedet radom variables with a fiite mea E [ i ] µ i ad a fiite variace Var[ ] σ i,,,,. We form the ormalized radom variable: Z so that EZ [ ] 0 ad Var[ Z ]. The, uder certai regularity coditios, the limitig distributio of Z is stadard ormal deoted Z N( 0, ), i.e. σ i µ i ---------------------------------- (6) lim () t PZ ( t) F Z t ---------- e y dy π (7) Special case: Let,,, be iid with a commo mea µ E [ i ] ad commo variace σ Var[ ], the (6) becomes Z ( µ ) ------------------------ σ (8) where is the sample mea. Therefore the sample mea from radom samples ted toward ormality as the sample size icreases. Give the Cetral Limit Theorem, the same cofidece iterval as i Sectio.. works here: PM ( zσ M µ M+ zσ M ) γ (9) where σ M σ ( )...4 Cofidece iterval whe populatio variace is ukow Example 3.35 of page 78 [3]: Assume that,,, be mutually idepedet idetically distributed radom variables such that N ( µσ, ). The it follows that V ( ------------------------ µ ) σ (0)

has the standard normal distribution (see the example on the first page). From Example 3.33,

    W = (n − 1)S²/σ² = Σ_{i=1}^{n} (X_i − X̄)²/σ²    (21)

has the chi-squared distribution with n − 1 degrees of freedom (prove this?). It follows that:

    T = V/√(W/(n − 1)) = [√n (X̄ − μ)/σ] / √(S²/σ²) = √n (X̄ − μ)/S    (22)

has the t-distribution with n − 1 degrees of freedom, where S = √[Σ_{i=1}^{n} (X_i − X̄)²/(n − 1)]. Then

    P(−t_{n−1; α/2} < √n (X̄ − μ)/S < t_{n−1; α/2}) = 1 − α,  or    (23)

    X̄ − t_{n−1; α/2} S_M ≤ μ ≤ X̄ + t_{n−1; α/2} S_M    (24)

where S_M = S/√n and t_{n−1; α/2} is the value of a t-distributed random variable with n − 1 degrees of freedom such that 100(1 − α)% of the probability mass of the random variable is contained between (−t_{n−1; α/2}, t_{n−1; α/2}).

Theorem 3.7 ([3], page 7): If X_1, X_2, ..., X_n is a sequence of mutually independent, standard normal variables, then

    Y = Σ_{i=1}^{n} X_i²    (25)

has the gamma distribution, GAM(1/2, n/2), or the chi-square distribution with n degrees of freedom, Y ~ χ²_n.
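As a concrete sketch of the t-based interval above: the data below are hypothetical, and the critical value 2.262 is the standard tabulated t value for 9 degrees of freedom at the 95% confidence level, hardcoded so that only the Python standard library is needed.

```python
import math

def t_confidence_interval(xs, t_crit):
    """CI for the population mean: x_bar +/- t_crit * S/sqrt(n), where
    S^2 is the sample variance (n - 1 in the denominator) and t_crit is
    the critical value of the t distribution with n - 1 degrees of freedom."""
    n = len(xs)
    x_bar = sum(xs) / n
    s2 = sum((x - x_bar) ** 2 for x in xs) / (n - 1)
    s_m = math.sqrt(s2 / n)          # S_M = S / sqrt(n)
    return x_bar - t_crit * s_m, x_bar + t_crit * s_m

# Hypothetical sample of size 10. 2.262 is the tabulated t value for
# n - 1 = 9 degrees of freedom and a 95% confidence level (alpha = 0.05).
data = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2, 5.1, 4.7, 5.0, 5.2]
low, high = t_confidence_interval(data, 2.262)
```

For this data the interval works out to roughly (4.89, 5.17). As n grows, the t critical value approaches the normal value 1.96 used in the known-variance case.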

If,,, are ot ormal, the above theorem does ot hold as strogly as does the CLT for []. Theorem 3.0. If V ad W are idepedet radom variables such that V N( 0, ) ad W χ, the the radom variable: T V --------------- W (6) has the t distributio with degrees of freedom. Example 3.3: Let,,, be a sequece of mutually idepedet, ormal variables N µσ (, ). The radom variables Z i µ i ------------- are stadard ormal. Therefore σ Y Z i ( µ ) --------------------- σ (7) has the chi-square distributio with degrees of freedom Y χ. We typically do ot kow µ, the populatio mea. Therefore, replace µ by the sample mea i. Defie a ra- dom variable Referece [] gives a theorem: S σ ----------- -------------- σ (8) Theorem 6.: Let Let,,, be a sequece of mutually idepedet, ormal variables N ( µσ, ) for all i. Let -- i be the sample mea, ad (9)

    S² = (1/(n − 1)) Σ_{i=1}^{n} (X_i − X̄)²    (30)

be the sample variance. Then:

1. X̄ and S² are independent.
2. (n − 1)S²/σ² = Σ_{i=1}^{n} (X_i − X̄)²/σ² ~ Chi-square(n − 1).

Question: V = √n (X̄ − μ)/σ is standard normal even if the X_i's are not normally distributed, provided n is large, because of the CLT; but does Theorem 3.7 hold even if the X_i's are not normally distributed?

Note on non-normal distributions [11]:

The t-based confidence interval procedures are often applied when X_1, X_2, ..., X_n are not drawn from a normal distribution. This is acceptable in large samples if the distribution of X_1, X_2, ..., X_n is reasonably symmetric. However, the procedures are not valid for highly skewed distributions. What about Pareto? File sizes follow the Pareto distribution!

Exercise: Work out the problem on page 49 of [3] in class.

2.2.5 Dependent samples

([3], page 503): In all the previous sub-sections on finding confidence intervals, we assumed that the samples were independent. But in most of our simulations, this is not likely to be true. In this case, the variance of the sample mean is no longer σ²/n. If we assume that the sequence is wide-sense stationary, then the autocovariance function

    K_{j−i} = E[(X_i − μ)(X_j − μ)] = Cov(X_i, X_j)    (31)

is finite and is a function of only j − i. The variance of the sample mean is:

    Var[X̄] = (1/n²) [ Σ_{i=1}^{n} Var[X_i] + Σ_{(i,j): i≠j} Cov(X_i, X_j) ]    (32)

because

    Var[X + Y] = E[((X + Y) − E[X + Y])²]    (33)

    Var[X + Y] = E[((X + Y) − E[X] − E[Y])²]    (34)

    Var[X + Y] = E[(X − E[X])² + (Y − E[Y])² + 2(X − E[X])(Y − E[Y])]    (35)

    Var[X + Y] = E[(X − E[X])²] + E[(Y − E[Y])²] + 2E[(X − E[X])(Y − E[Y])]    (36)

Coming back to (32):

    Var[X + Y] = Var[X] + Var[Y] + 2Cov(X, Y)    (37)

In terms of the autocovariances, (32) becomes:

    Var[X̄] = σ²/n + (2/n) Σ_{j=1}^{n−1} (1 − j/n) K_j    (38)

Write out the second term in (32):

    Σ_{(i,j): i≠j} Cov(X_i, X_j) = [Cov(X_1, X_2) + Cov(X_1, X_3) + ... + Cov(X_1, X_n)]
                                + [Cov(X_2, X_1) + Cov(X_2, X_3) + ... + Cov(X_2, X_n)]
                                + ...
                                + [Cov(X_n, X_1) + Cov(X_n, X_2) + ... + Cov(X_n, X_{n−1})]    (39)

Consider how many terms have |i − j| = 1. There is 1 on the first line, 2 on the second, and 2 on each subsequent line until the last but one. The last line again has only one such term. Therefore, K_1 occurs

    2 + 2(n − 2) = 2(n − 1) times.    (40)

Consider the multiplicative factor for K_1 in (38). It is

    (2/n)(1 − 1/n) = 2(n − 1)/n².    (41)

Given the factor 1/n² in the denominator in (32), this checks out: 2(n − 1) occurrences of K_1, each multiplied by 1/n².

Consider the multiplicative factor of K_2. From (39), we see that the first two rows have only 1 such term, as do the last two rows. The intermediate rows will have two such terms, e.g., for the row of X_3, Cov(X_3, X_1) and Cov(X_3, X_5). Thus from (39), the multiplicative factor of K_2 is [4 + 2(n − 4)]/n² = 2(n − 2)/n². From (38), we see this factor is:

    (2/n)(1 − 2/n) = 2(n − 2)/n².    (42)

Checks out! So (38) is equivalent to (32). As n → ∞,

    lim_{n→∞} n Var[X̄] = σ² + 2 Σ_{j=1}^{∞} K_j ≡ aσ²,  where  a = 1 + (2/σ²) Σ_{j=1}^{∞} K_j    (43)

It can be shown that, under rather general conditions, the statistic

    (X̄ − μ) / (σ √(a/n))    (44)

of the correlated data approaches the standard normal distribution. Therefore, the 100(1 − α)% CI for μ is given by:

    X̄ ± z_{α/2} σ √(a/n)    (45)

We can avoid having to estimate σ²a by using the method of independent replications. Replicate an experiment m times, with each experiment containing n observations. If the seeds in the m experiments are chosen randomly, then the results of different experiments will be independent, though the observations within an experiment will be dependent. Let the i-th observation in the j-th

experiment be X_i^(j). Let the sample mean and sample variance of the j-th experiment be X̄^(j) and S²^(j), where:

    X̄^(j) = (1/n) Σ_{i=1}^{n} X_i^(j)    and    (46)

    S²^(j) = (1/(n − 1)) Σ_{i=1}^{n} [X_i^(j) − X̄^(j)]²    (47)

Since X̄^(1), X̄^(2), ..., X̄^(m) are independent (and identically distributed), from the individual sample means we obtain an estimator μ̂ for the population mean μ as

    μ̂ = (1/m) Σ_{j=1}^{m} X̄^(j) = (1/(nm)) Σ_{j=1}^{m} Σ_{i=1}^{n} X_i^(j)    (48)

    V = (1/(m − 1)) Σ_{j=1}^{m} [X̄^(j) − μ̂]²    (49)

Since an estimate of the variance is used, we use the t-distribution. Therefore the statistic (μ̂ − μ)/√(V/m) is approximately t-distributed with (m − 1) degrees of freedom. The 100(1 − α)% CI is

    μ̂ ± t_{m−1; α/2} √(v/m)    (50)

This is actually one of six approaches to handle this dependence-of-samples problem [8] (see the basic.pdf and others.pdf files posted on the web site, which are extracted pages from [8]). Also see [10]. They call this approach replication/deletion and suggest deleting some initial sample points to get rid of transient behavior.

Batch means method: Ignore the first few sample points for transients. Then take the whole (steady-state) process of sample points and divide it into batches of k samples each. The "sample means" of those batches/segments can roughly be treated as independent, if k is sufficiently large. The key for using batch means is to select large enough batch sizes. See [9] for a rule of thumb on how large to make the batch sizes. The PostNotes3.doc file [9] on CI calculation explains the details. In that file, the number of observations per batch, k, is suggested to be at least 4t, where the choice of t is the number of observations before the correlation dies out (decays to almost zero). This means that the correlation should be computed and then the batch size determined. In simulations that one of my research students ran, the correlation died out in 200 samples; this means each batch size should be 800.

Spectrum analysis method [8]: Obtain an estimator for Var[X̄] in (38) by replacing K_j with an estimator K̂_j obtained from the samples:

    K̂_j = Σ_{i=1}^{n−j} [X_i − X̄][X_{i+j} − X̄] / (n − j)    (51)

and an estimator for σ² in (38) as S² = Σ_{i=1}^{n} (X_i − X̄)²/(n − 1). Plug K̂_j instead of K_j and S² instead of σ² into (38) and get an estimate v of Var[X̄]. The 100(1 − α)% CI for the mean is

    x̄ ± t_{n−1; α/2} √v.    (52)

Matlab correlation functions (maybe corr) can be used to compute the correlation. Correlation ρ_{XY} = Cov(X, Y)/√(Var(X) Var(Y)).

An example: For our research work on VBLS, here is what we did to determine how long to run when applying the replication/deletion approach. Each simulation run was executed for 16000 sec because of the following computation. Our largest file size was a 1 GB file and the minimum bandwidth was 100 Mbps, which means the maximum transfer time is 80 s. If we want 200 such samples, we need to simulate at least 200*80 sec = 16000 sec. With a call arrival rate of 50 calls/sec, we need at least 50*16000 or 900000 files. The first 20% of the data was dropped. This was arbitrary. My student said it didn't

impact the results greatly. He could have dropped just 10%. The number of replications was 5. In other words, my student executed 5 runs for each lambda (call arrival rate). He computed the mean of the file transfer delays from all the sample points in each run. Then he took these 5 means and computed another mean and then the CI using the t-distribution formula with 4 as the degrees of freedom. The five means can be assumed to be independent since they are from different runs. The built-in matlab function normfit can be used for CI calculation.

3. Appendices

3.1 Appendix I: Find the pdf and CDF of a function of a random variable whose pdf and CDF are known

Theorem 3. (page 40, [3]): Let X be a continuous r.v. with density f_X that is nonzero on a subset I of the real numbers (that is, f_X(x) > 0 for x ∈ I and f_X(x) = 0 for x ∉ I). Let Φ be a differentiable monotonic function whose domain is I and whose range is the set of reals. Then Y = Φ(X) is a continuous random variable with density f_Y given by:

    f_Y(y) = f_X[Φ^{−1}(y)] |(Φ^{−1})'(y)|,   y = Φ(x), x ∈ I
    f_Y(y) = 0,   otherwise    (53)

Proof: Assume Φ(·) is an increasing function.

    F_Y(y) = P(Y ≤ y) = P(Φ(X) ≤ y) = P(X ≤ Φ^{−1}(y)) = F_X(Φ^{−1}(y))    (54)

To get the density function (use the chain rule: dy/dx = (dy/du)(du/dx)):

    f_Y(y) = (d/dy) F_Y(y) = (d/dy) F_X(Φ^{−1}(y)) = [(d/dΦ^{−1}(y)) F_X(Φ^{−1}(y))] · (d/dy) Φ^{−1}(y)    (55)

    f_Y(y) = f_X[Φ^{−1}(y)] (Φ^{−1})'(y)    (56)

If Y = aX + b, then

    f_Y(y) = (1/|a|) f_X((y − b)/a),   y = ax + b, x ∈ I
    f_Y(y) = 0,   otherwise    (57)

The proof is similar for the case when Φ(·) is decreasing. There, P(Φ(X) ≤ y) = P(X ≥ Φ^{−1}(y)) = 1 − F_X(Φ^{−1}(y)). But when we take the derivatives we will get (53). The reason for taking the absolute value is that a pdf is positive.

3.2 Appendix II: Derivation for Var[sample mean]

Derivation of Var[X̄] = σ²/n: Var[X + Y] = Var[X] + Var[Y] if X and Y are independent. Therefore

    Var[Σ_{i=1}^{n} X_i] = Σ_{i=1}^{n} Var[X_i] = nσ²,  and  Var[X̄] = Var[(1/n) Σ_{i=1}^{n} X_i] = (1/n²)(nσ²) = σ²/n.

This is from page 93 of [3]. Here is how I verified it. Let Z = Y/n, where Y = Σ_{i=1}^{n} X_i.

    Var[Z] = E[Z²] − (E[Z])² = ∫ z² f_Z(z) dz − (E[Z])²    (58)

Using (57), since Z = Y/n,

    f_Z(z) = n f_Y(nz).    (59)

Therefore (58) becomes (substituting y = nz):

    Var[Z] = ∫ z² n f_Y(nz) dz − (E[Z])² = ∫ (y/n)² n f_Y(y) d(y/n) − (E[Z])²    (60)

    Var[Z] = (1/n²) ∫ y² f_Y(y) dy − (E[Z])²    (61)

    E[Z] = ∫ z f_Z(z) dz = ∫ z n f_Y(nz) dz = ∫ (y/n) n f_Y(y) d(y/n) = (1/n) ∫ y f_Y(y) dy = E[Y]/n    (62)

    Var[Z] = (1/n²) E[(Y − E[Y])²] = (1/n²) Var(Y) = nσ²/n² = σ²/n    (63)

3.3 Distributions

3.3.1 Normal distribution

If X ~ N(μ, σ²), its pdf is:

    f(x) = (1/(σ√(2π))) exp[−(1/2)((x − μ)/σ)²],   −∞ < x < ∞    (64)

    F_Z(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−y²/2} dy,  where Z ~ N(0, 1)    (65)

    F_X(x) = F_Z((x − μ)/σ)    (66)

3.3.2 Gamma, chi-square and t distributions

Gamma pdf: GAM(λ, α), pg. 7/8 [3]:

    f(t) = λ^α t^{α−1} e^{−λt} / Γ(α),   α > 0, t > 0.    (67)

    Γ(α) = ∫_{0}^{∞} x^{α−1} e^{−x} dx,   Γ(α) = (α − 1) Γ(α − 1),  and    (68)

    Γ(n) = (n − 1)!  if n is a positive integer    (69)

The chi-square distribution is a special case of the gamma distribution with α = n/2 and λ = 1/2, where n is a positive integer. Thus if X ~ GAM(1/2, n/2), then it is said to have a chi-squared distribution with n degrees of freedom, X ~ χ²_n.

Student-t pdf (one parameter, n):

    f_T(t) = [Γ((n + 1)/2) / (√(nπ) Γ(n/2))] (1 + t²/n)^{−(n+1)/2},   −∞ < t < ∞    (70)
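The Var[X̄] = σ²/n result derived in Appendix II can also be checked empirically. Below is a small sketch; the Uniform(0, 1) population, the sample size, and the replication count are arbitrary choices, and nothing in the check relies on the population being normal.

```python
import random
import statistics

# Empirical check of Var[sample mean] = sigma^2 / n (Appendix II).
# Population: Uniform(0, 1), for which sigma^2 = 1/12.
random.seed(7)
n = 25          # sample size
reps = 20000    # number of independent samples of size n

means = []
for _ in range(reps):
    xs = [random.random() for _ in range(n)]
    means.append(sum(xs) / n)

var_of_mean = statistics.pvariance(means)   # empirical Var[X-bar]
expected = (1.0 / 12.0) / n                 # sigma^2 / n
```

The two numbers agree to within sampling noise; any population with a finite variance gives the same agreement, consistent with the remark that Var[X̄] = σ²/n does not require normality.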

References

[1] K. S. Trivedi, Probability and Statistics with Reliability, Queueing and Computer Science Applications, Second Edition, Wiley, 2001, ISBN 0-471-33341-7.
[2] R. Yates and D. Goodman, Probability and Stochastic Processes, Wiley, ISBN 0-471-17837-3.
[3] K. S. Trivedi, Probability and Statistics with Reliability, Queueing and Computer Science Applications, First Edition, Prentice Hall, 1982, ISBN 0-13-711564-4.
[4] D. Bertsekas and R. Gallager, Data Networks, Prentice Hall, 1986, ISBN 0-13-196825-4.
[5] Prof. Boorstyn's notes, Polytechnic University, NY.
[6] A. Leon-Garcia and I. Widjaja, Communication Networks, First Edition, McGraw Hill, 2000.
[7] Mischa Schwartz, Telecommunications Networks: Protocols, Modeling and Analysis, Addison-Wesley, 1987.
[8] Averill M. Law and W. David Kelton, Simulation Modeling and Analysis, McGraw Hill, 2000.
[9] Output Analysis of a Single System, IE 305 Simulation, distributed /9/03 (posted on the web site as PostNotes3.doc).
[10] Prof. William Sanders, UIUC notes (posted on the web site).
[11] R. Fewster, Stats 210, http://www.stat.auckland.ac.nz/~u4750/notes.php, chapter 6.