NOTES ON DISTRIBUTIONS

MICHAEL N. KATEHAKIS

1. Random Variables

Random variables represent outcomes from random phenomena. They are specified by two objects: the range R of possible values and the frequency f(x) with which values from within the range can occur. When the range is a discrete set we have a discrete random variable, and when the range is continuous we have a continuous random variable. See several examples below. For notational convenience we often extend the range to a larger set, where outcomes outside the original range are assigned zero frequency. The statements "the random variable X is discrete (respectively continuous)", "the distribution f(x) is discrete (respectively continuous)", and "the range R is discrete (respectively continuous)" are equivalent.

The cumulative frequency, or distribution function, of a random variable X is defined as

    F(x) = Σ_{y ≤ x} f(y)           if X is discrete,
    F(x) = ∫_{-∞}^{x} f(y) dy       if X is continuous.

2. Expectation and moments

The expected value, or first moment, of a random variable X is a constant EX defined by

    EX = ∫_R x dF(x) = Σ_x x f(x)              if X is discrete,
    EX = ∫_R x dF(x) = ∫_{-∞}^{∞} x f(x) dx    if X is continuous.

The n-th moment of a random variable X is a constant EX^n defined by

    EX^n = ∫_R x^n dF(x) = Σ_x x^n f(x)              if X is discrete,
    EX^n = ∫_R x^n dF(x) = ∫_{-∞}^{∞} x^n f(x) dx    if X is continuous.

The variance of a random variable X is a constant Var(X), or σ_X^2, defined by

    Var(X) = E[(X - EX)^2].
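As a quick numerical illustration of these definitions, here is a minimal Python sketch that computes EX, the second moment, and Var(X) directly from a frequency function; the fair six-sided die used here is just an assumed example, not something from the notes.

    # Expectation, second moment, and variance of a discrete random variable,
    # computed directly from the definitions above (fair six-sided die).
    range_R = [1, 2, 3, 4, 5, 6]
    f = {x: 1 / 6 for x in range_R}                      # frequency function f(x)

    EX = sum(x * f[x] for x in range_R)                  # first moment:  sum_x x f(x)
    EX2 = sum(x**2 * f[x] for x in range_R)              # second moment: sum_x x^2 f(x)
    VarX = sum((x - EX) ** 2 * f[x] for x in range_R)    # Var(X) = E[(X - EX)^2]

    print(EX, EX2, VarX)                                 # 3.5, 15.166..., 2.9166...

The same pattern, with integrals replacing the sums, gives the continuous case.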

3. Joint Distributions - Independent random variables

Often two random variables X and Y have a joint distribution, defined by a two-dimensional frequency function f(x, y). In the discrete case

    f(x, y) = P(X = x, Y = y).

In the continuous case

    P(X ≤ x, Y ≤ y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du.

The marginal frequency, in the discrete case, is defined by

    f_X(x) = P(X = x) = Σ_y f(x, y).

In the continuous case it is easier to define the marginal distribution:

    F_X(x) = P(X ≤ x) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f(u, y) dy du.

The covariance of two jointly distributed random variables X and Y is a constant Cov(X, Y) defined by

    Cov(X, Y) = E[(X - EX)(Y - EY)].

Two jointly distributed random variables X and Y are independent if

    F(x, y) = F_X(x) F_Y(y)   for all x, y.

Properties of expectation and variance:

    E(aX + b) = a EX + b,
    E(X + Y) = EX + EY,
    E(XY) = EX EY, for independent X and Y,
    Var(X) = EX^2 - (EX)^2,
    Var(aX + b) = a^2 Var(X),
    Var(X + Y) = Var(X) + Var(Y), for independent X and Y,
    Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), in general,
    Cov(X, Y) = E(XY) - EX EY,
    Cov(X, X) = Var(X).

Summary of EX and Var(X) for various distributions:

    X                   EX          Var(X)
    Bernoulli(p)        p           p(1-p)
    Binomial(n, p)      np          np(1-p)
    Geometric(p)        1/p         (1-p)/p^2
    Poisson(λ)          λ           λ
    Uniform{1,...,n}    (n+1)/2     (n^2-1)/12
    Uniform(a, b)       (a+b)/2     (b-a)^2/12
    N(µ, σ^2)           µ           σ^2
    exp(λ)              1/λ         1/λ^2

4. Discrete random variables

1. The Bernoulli random variable represents experiments (trials) with only two possible outcomes: success (1) or failure (0). We write X ~ Bernoulli(p), where p = f(1). The range is {0, 1} with frequency function f(1) = p and f(0) = 1 - p, i.e., P(X = 1) = f(1) = p and P(X = 0) = f(0) = 1 - f(1) = 1 - p.
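The Bernoulli row of the table and the independence properties above are easy to sanity-check by simulation. The sketch below is only an illustration; it assumes the NumPy library and arbitrary choices of p, the sample size, and the seed. It draws two independent Bernoulli(p) samples and compares empirical moments with p, p(1 - p), and the addition rule for variances.

    # Simulation check of the Bernoulli row of the table and of two properties:
    # E(XY) = EX EY and Var(X + Y) = Var(X) + Var(Y) for independent X and Y.
    import numpy as np

    rng = np.random.default_rng(0)
    N, p = 1_000_000, 0.3

    X = rng.binomial(1, p, N)        # independent Bernoulli(p) draws
    Y = rng.binomial(1, p, N)        # a second, independent sample

    print(X.mean(), p)                                  # ~0.30  vs EX = p
    print(X.var(), p * (1 - p))                         # ~0.21  vs Var(X) = p(1 - p)
    print((X * Y).mean(), X.mean() * Y.mean())          # E(XY)  ~ EX EY
    print((X + Y).var(), X.var() + Y.var())             # Var(X + Y) ~ Var(X) + Var(Y)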

2. The Binomial random variable represents the number of successes in n independent Bernoulli trials, each with success probability p = f(1). We write X ~ Binomial(n, p). The range is {0, 1, ..., n} with frequency function

    f(k) = C(n, k) p^k (1-p)^{n-k},   k = 0, 1, ..., n,

where p^k is the probability of k successes, (1-p)^{n-k} is the probability of n - k failures, and C(n, k) = n!/(k!(n-k)!) is the number of outcomes with k successes in n Bernoulli trials.

Applications: # of defectives in a sample, # of successful requests, # of days without an accident, etc.

Relation with the Bernoulli: Binomial(1, p) = Bernoulli(p), i.e., a Bernoulli r.v. is the # of successes in a single Bernoulli trial; and if X_1, X_2, ..., X_n are independent Bernoulli(p), then Y = Σ_{i=1}^{n} X_i ~ Binomial(n, p).

3. The Geometric random variable represents the number of trials until you get the first success in independent Bernoulli trials, each with success probability p. We write X ~ G(p). The range is {1, 2, ...} with frequency function

    f(k) = (1-p)^{k-1} p,   k = 1, 2, ...

Memoryless Property. An important property of the geometric distribution is

    P(X > s + t | X > t) = P(X > s)   for all s, t = 0, 1, 2, ...

This property means that if you repeat an experiment until the first success then, given that the first success has not yet occurred, the conditional probability distribution of the number of additional trials does not depend on how many failures have been observed. The die one throws or the coin one tosses does not have a memory of these failures.
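To see the memoryless property numerically, one can use the tail formula P(X > m) = (1 - p)^m for X ~ G(p), which holds because X > m exactly when the first m trials are all failures. The short Python sketch below is an illustration only, with arbitrary choices of p, s, and t.

    # Numerical check of the geometric memoryless property:
    # P(X > s + t | X > t) = P(X > s), using P(X > m) = (1 - p)^m.
    p = 0.3

    def tail(m):
        """P(X > m): the first m Bernoulli(p) trials were all failures."""
        return (1 - p) ** m

    s, t = 4, 7
    lhs = tail(s + t) / tail(t)   # P(X > s + t | X > t)
    rhs = tail(s)                 # P(X > s)
    print(lhs, rhs)               # both equal 0.7**4 = 0.2401 (up to rounding)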

The geometric distribution is the only memoryless discrete distribution.

Applications: # of tests to locate a bug, # of trials waiting for an event (e.g., the arrival of a customer in discrete time), etc.

Example. Suppose a computer program has an error in it. To locate the error, investigators conduct a series of independent tests. Each test is based on random inputs and it detects the error with probability 0.30.
(a) The probability of detecting the error on the third test is (0.7)^2 (0.3) = 0.147.
(b) The probability that it takes at most three tests to detect the error is

    P(X ≤ 3) = P(X = 1) + P(X = 2) + P(X = 3) = 0.3 + (0.7)(0.3) + (0.7)^2 (0.3) = 0.657.

4. The Poisson random variable typically represents the number of events that occur in a fixed time interval, when two events are unlikely to occur simultaneously during a short time period. We write X ~ Poisson(λ). The parameter λ represents the rate, i.e., the average number of events per unit of time (or area). The range is {0, 1, 2, ...} with frequency function

    f(k) = e^{-λ} λ^k / k!,   k = 0, 1, 2, ...

Applications: # of customer arrivals in a store during a day, # of job arrivals during a fixed period of time, # of accidents, # of telephone calls in discrete time, etc.

Relation with the Binomial: when in a Binomial(n, p) the success probability p is small (p ≤ 0.05) and the number of trials n is large, i.e., the successes are rare, then Poisson(np) ≈ Binomial(n, p). We use this Poisson approximation for the Binomial whenever possible, as working with the Poisson is easier; a numerical comparison is sketched below.

5. The Discrete uniform random variable typically represents outcomes from a finite set that can each occur with equal probability. The range is {1, 2, ..., n} with frequency function f(k) = 1/n for k = 1, ..., n.

Applications: the demand realized in a fixed time period; when throwing a fair die the possible values are 1, 2, 3, 4, 5, 6, and each time the die is thrown the probability of a given score is 1/6; etc.
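Here is the numerical comparison mentioned above for the Poisson approximation of the Binomial. It is only a sketch, with n = 100 and p = 0.02 chosen as an arbitrary rare-event example, and it uses only the Python standard library.

    # Poisson(np) as an approximation to Binomial(n, p) when p is small and n is large.
    from math import comb, exp, factorial

    n, p = 100, 0.02              # rare successes: np = 2
    lam = n * p

    def binom_pmf(k):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def poisson_pmf(k):
        return exp(-lam) * lam**k / factorial(k)

    for k in range(6):
        print(k, round(binom_pmf(k), 4), round(poisson_pmf(k), 4))
    # k = 0: 0.1326 vs 0.1353,  k = 2: 0.2734 vs 0.2707, and so on.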

5. Continuous random variables

6. The Exponential random variable typically represents the random time between events that happen at a constant average rate. We use the notation X ~ exp(λ). The range is [0, ∞) with density f(x) = λ e^{-λx} and distribution function F(x) = 1 - e^{-λx}, for x ≥ 0.

The exponential distribution may be viewed as a continuous counterpart of the geometric distribution, which describes the number of Bernoulli trials necessary for the first success. The exponential distribution models the time until the first arrival of a Poisson process. Specifically, suppose that N(t) = # of rare events in [0, t] ~ Poisson(λt). Let X = time of the first event (or X = time of the next event, or X = time between two events). Then X ~ exp(λ).

Memoryless Property. An important property of the exponential distribution is

    P(X > s + t | X > t) = P(X > s)   for all s, t ≥ 0.

This means, for example, that the conditional probability that we need to wait 5 more seconds before the first arrival, given that the first arrival has not yet happened after 30 seconds, is the same as the initial probability that we need to wait more than 5 seconds for the first arrival, i.e., P(X > 35 | X > 30) = P(X > 5). The exponential distribution is the only memoryless continuous distribution.

Applications: the inter-arrival times in queues, the time it takes to process an order at a fast-food restaurant, etc.

7. The Uniform random variable typically represents outcomes such that values in all intervals of the same length are equally probable. We use the notation X ~ U(a, b), where b > a. The main application of uniform r.v.'s is in the simulation of random variables (see the sketch at the end of this section). The range is (a, b) with density f(x) = 1/(b - a) and distribution function F(x) = (x - a)/(b - a), for a ≤ x ≤ b.

Standard Uniform distribution: X ~ U(0, 1).

8. The Normal random variable typically represents outcomes such that values in intervals close to the mean are more probable than values in intervals of the same length that are far from the mean. It is often called the bell curve because the graph of its probability density resembles a bell. The standard normal distribution is the normal distribution with a mean of zero and a standard deviation of one; its density and its distribution function are denoted by φ(x) and Φ(x), respectively. We use the notation X ~ N(µ, σ^2). The range is (-∞, +∞) with density

    f(x) = (1/(σ √(2π))) e^{-(x - µ)^2 / (2σ^2)}

and distribution function F(x) = ∫_{-∞}^{x} f(y) dy.

Some properties of the Normal distribution:

1. (a) P(µ - σ < X < µ + σ) ≈ 0.68, (b) P(µ - 2σ < X < µ + 2σ) ≈ 0.95, (c) P(µ - 3σ < X < µ + 3σ) ≈ 0.997.
2. If X ~ N(µ, σ^2) and Y = a + bX, then Y ~ N(a + bµ, b^2 σ^2).
3. If X ~ N(µ, σ^2) and Y = (X - µ)/σ, then Y ~ N(0, 1).
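As an illustration of the remark that uniform random variables are mainly used to simulate other random variables, the sketch below inverts F(x) = 1 - e^{-λx} to turn standard uniforms into exp(λ) samples, then checks the mean 1/λ and the memoryless identity P(X > 35 | X > 30) = P(X > 5). It assumes the NumPy library, and the rate λ = 0.04 and the seed are arbitrary example choices.

    # Inverse-transform simulation: if U ~ U(0, 1), then X = -ln(1 - U)/lam ~ exp(lam),
    # because solving F(X) = U for X gives exactly this expression.
    import numpy as np

    rng = np.random.default_rng(1)
    lam = 0.04                               # rate; mean inter-arrival time 1/lam = 25 seconds
    U = rng.uniform(0.0, 1.0, 1_000_000)     # standard uniform inputs
    X = -np.log(1.0 - U) / lam               # exp(lam) samples

    print(X.mean(), 1 / lam)                 # ~25.0 vs 25.0
    print((X > 35).mean() / (X > 30).mean(),  # P(X > 35 | X > 30), estimated from the sample
          np.exp(-lam * 5))                   # = P(X > 5) = e^(-5 lam), about 0.8187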

Department of Management Science and Information Systems, Rutgers Business School, Newark and New Brunswick, 180 University Avenue, Newark, NJ 07102-1895

E-mail address: mnk@rci.rutgers.edu