Independent Events

Two events are independent if knowing that one occurs does not change the probability of the other occurring.
- Conditional probability is denoted P(A | B), defined as P(A and B) / P(B); P(A | B) is the probability of A occurring given that B occurs.
- A is independent of B if P(A) = P(A | B).
- Independence is equivalent to P(A and B) = P(A) * P(B); if A is independent of B, then B is independent of A.
- In terms of a Venn diagram, P(A) = P(A | B) says that the ratio of the area of (A and B) to the area of B is the same as the area of A.
[Venn diagram: overlapping events A and B, with the intersection labeled "A and B"]
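
As a small worked illustration of these definitions (not from the original slides), the following Python sketch checks independence for two hypothetical events defined on a fair six-sided die.

```python
from fractions import Fraction

# Hypothetical example: one roll of a fair six-sided die.
outcomes = set(range(1, 7))
A = {o for o in outcomes if o % 2 == 0}   # "the roll is even"
B = {o for o in outcomes if o <= 4}       # "the roll is at most 4"

def p(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(outcomes))

p_A, p_B, p_AB = p(A), p(B), p(A & B)
p_A_given_B = p_AB / p_B                  # P(A | B) = P(A and B) / P(B)

print(p_A, p_A_given_B)                   # 1/2 and 1/2, so P(A) = P(A | B)
print(p_AB == p_A * p_B)                  # True, so P(A and B) = P(A)P(B)
```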

Independence of Random Variables

Two random variables X and Y are independent if knowing the realized value of one of them (e.g., Y = y) doesn't provide any information on the probability of the other (X) taking on any particular value; notation:
P(X = x) = P(X = x | Y = y) for all possible realized values of X and Y.
This is equivalent to saying that the events A and B are independent, where
A = {outcomes for which X = x}
B = {outcomes for which Y = y}
for all possible realized values of X and Y.

Independence of Random Variables (continued)

Observe 250 realizations of three random variables, X, Y, and Z, i.e., {x_i, y_i, z_i} for i = 1 to 250.
The R² (where R is the correlation) between the x_i values and the y_i values is a measure of what percentage of the deviations of the y_i's from E(Y) can be predicted by knowing the deviations of the corresponding x_i's from E(X).
R² for these realizations:
- R² ≈ 0: the top figure shows essentially no correlation between X and Y, i.e., knowing X's realizations tells you nothing about Y's realizations; we can reasonably believe that X and Y are independent.*
- R² = 1: the bottom figure shows perfect correlation between Z and Y, i.e., knowing Y's realizations tells you everything about Z's realizations; we can reasonably believe that Y and Z are not independent.
[Scatter plots: Y realizations vs. X realizations ("Zero Correlation Example, R² close to 0%"), and Z realizations vs. Y realizations (perfect correlation)]
* Realizations from independent random variables will have zero correlation, but note that zero correlation does not necessarily imply independence in general.
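
The two cases can be reproduced numerically; the sketch below (an illustration with synthetic data, not the slides' original data, assuming NumPy is available) computes R² for an independently generated pair and for a deterministically related pair.

```python
import numpy as np

rng = np.random.default_rng(0)            # arbitrary seed, for reproducibility

# 250 realizations each, mirroring the slide's setup.
x = rng.uniform(0.0, 100.0, size=250)
y = rng.uniform(0.0, 100.0, size=250)     # generated independently of x
z = 2.0 * y + 5.0                         # a deterministic function of y

r_xy = np.corrcoef(x, y)[0, 1]
r_zy = np.corrcoef(z, y)[0, 1]

print(f"R^2 between X and Y: {r_xy**2:.3f}")   # close to 0
print(f"R^2 between Z and Y: {r_zy**2:.3f}")   # 1.000
```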

Properties of Random Numbers

Random numbers are (independent) realizations of a random variable U(0,1) with uniform probability distribution on the closed interval [0, 1] (i.e., including 0 and 1).
Can think of them as a series of realizations of U_i, i = 1, 2, 3, ..., where the U_i are independent, identically distributed (i.i.d.) uniform random variables.

f(x) = \begin{cases} 1, & 0 \le x \le 1 \\ 0, & \text{otherwise} \end{cases}

E(R) = \int_0^1 x \, dx = \left. \frac{x^2}{2} \right|_0^1 = \frac{1}{2}
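
As a quick check of these properties, a minimal sketch assuming NumPy (note that most practical generators, including NumPy's, return values on the half-open interval [0, 1)):

```python
import numpy as np

rng = np.random.default_rng()      # NumPy's default generator (PCG64)
u = rng.random(100_000)            # (approximately) i.i.d. U(0, 1) realizations

print(u.mean())    # close to E(R) = 1/2
print(u.var())     # close to 1/12, the variance of a U(0, 1) random variable
```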

More General Uniform Probability Distribution

A random variable X is uniformly distributed on the interval (a, b), denoted U(a,b), if its pdf and cdf are

f(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}
\qquad
F(x) = \begin{cases} 0, & x < a \\ \dfrac{x-a}{b-a}, & a \le x < b \\ 1, & x \ge b \end{cases}

Properties:
- P(x_1 < X < x_2) is proportional to the length of the interval, because F(x_2) − F(x_1) = (x_2 − x_1)/(b − a).
- E(X) = (a + b)/2
- V(X) = (b − a)²/12
As described on the previous slide, U(0,1) is what we use to define random numbers.
[Figure: the pdf f(x), constant at height 1/(b − a) between a and b]
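
A U(a, b) variate can be obtained by rescaling a U(0, 1) random number, X = a + (b − a)U. The sketch below (assuming NumPy, with illustrative values a = 1 and b = 6) checks the mean and variance formulas above.

```python
import numpy as np

def uniform_ab(rng, a, b, size):
    """Draw U(a, b) variates by rescaling U(0, 1) random numbers: X = a + (b - a) * U."""
    u = rng.random(size)
    return a + (b - a) * u

rng = np.random.default_rng(1)                  # arbitrary seed
x = uniform_ab(rng, a=1.0, b=6.0, size=100_000)

print(x.mean())    # close to (a + b)/2 = 3.5
print(x.var())     # close to (b - a)^2 / 12, about 2.083
```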

Review: Probability Density Functions and Cumulative Distributions

- Probability density functions on the left; cumulative probability distributions on the right.
- The area under the left-hand curve from 0 up to a point x_0 equals the height of the right-hand curve at x_0 (area = height).
[Figures: pdf and cdf of an Exponential(λ) distribution, and pdf and cdf of a Uniform(1, 6) distribution; for the Uniform(1, 6) case with x_0 = 4, area = (4 − 1)(0.2) = 0.6]
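
The area = height relationship can be verified directly; a short sketch (assuming SciPy is available) evaluates both distributions pictured above.

```python
import math
from scipy.stats import uniform, expon

# Uniform(1, 6): in SciPy's parameterization this is loc=1, scale=6-1=5.
U = uniform(loc=1, scale=5)
print(U.pdf(4.0))     # density height 0.2
print(U.cdf(4.0))     # area from 1 to 4: (4 - 1) * 0.2 = 0.6

# Exponential(lambda = 1): the area under the pdf up to x0 is the cdf value there.
E = expon(scale=1.0)  # scale = 1/lambda
print(E.cdf(1.5), 1.0 - math.exp(-1.5))   # both about 0.777
```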

Generation of Pseudo-Random Numbers

"Pseudo", because generating numbers using a known method removes the potential for true randomness.
Goal: to produce a sequence of numbers falling into the closed interval [0, 1] that simulates the random variable R, i.e., so that the realizations exhibit the ideal properties of random numbers (RN): uniformity and independence.
Important considerations for software that generates random numbers:
- Fast
- Portable to different computers
- Sufficiently long cycle
- Replicable
- Uniformity and independence
One standard way to generate random numbers is a linear congruential generator, or better yet a combination of them.
Random numbers are important in simulation: realizations of random variables are generated using random numbers, and these realizations are called random variates.

Linear Congruential Random Number Generator

Generate a sequence of integers Z_1, Z_2, Z_3, ... via the recursion

Z_i = (a Z_{i-1} + c) mod m,    i = 1, 2, ...

where a, c, and m are carefully chosen constants (so the sequence contains at most m distinct values).
- Specify a seed Z_0 to start off.
- "mod m" means take the remainder after dividing (a Z_{i-1} + c) by m, so the Z_i's are between 0 and m − 1.
- Return the i-th random number as U_i = Z_i / m.
Issues:
- Cycling (repeating values of the random numbers)
- Independence
- Uniformity
The selection of the values for a, c, m, and Z_0 drastically affects the statistical properties and the cycle length. It is typical to overcome these issues by combining several linear congruential generators.
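
A minimal LCG sketch in Python; the constants a, c, and m below are the widely published Numerical Recipes values, chosen only for illustration rather than being the parameters of any particular simulation package.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32, n=5):
    """Yield n pseudo-random numbers from a linear congruential generator."""
    z = seed                        # Z_0, the seed
    for _ in range(n):
        z = (a * z + c) % m         # Z_i = (a * Z_{i-1} + c) mod m
        yield z / m                 # U_i = Z_i / m

print(list(lcg(seed=12345)))
```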

The Current (as of 2000) Arena Random Number Generator

Uses some of the same ideas as an LCG (modulo division, recursion on earlier values), but is not an LCG:
- It combines two separate component generators.
- The recursions involve more than just the preceding value.
It is a combined multiple recursive generator (CMRG):
- Two simultaneous recursions:
  A_n = (1403580 A_{n-2} − 810728 A_{n-3}) mod 4294967087
  B_n = (527612 B_{n-1} − 1370589 B_{n-3}) mod 4294944443
- Combine the two:
  Z_n = (A_n − B_n) mod 4294967087
- The next random number:
  U_n = Z_n / 4294967088 if Z_n > 0, and U_n = 4294967087 / 4294967088 if Z_n = 0
Seed = a six-vector of the first three A_n's and B_n's.
Cycle length ≈ 3.1 × 10^57.
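
A sketch of these CMRG recursions (L'Ecuyer's MRG32k3a, on which the Arena generator is based); the seed values below are arbitrary illustrative choices, not Arena's defaults.

```python
M1, M2 = 4294967087, 4294944443

def cmrg(a_seed=(12345, 12345, 12345), b_seed=(12345, 12345, 12345), n=5):
    """Yield n random numbers from the combined multiple recursive generator."""
    a = list(a_seed)   # [A_{n-3}, A_{n-2}, A_{n-1}]
    b = list(b_seed)   # [B_{n-3}, B_{n-2}, B_{n-1}]
    for _ in range(n):
        a_n = (1403580 * a[1] - 810728 * a[0]) % M1   # first component
        b_n = (527612 * b[2] - 1370589 * b[0]) % M2   # second component
        a = [a[1], a[2], a_n]
        b = [b[1], b[2], b_n]
        z = (a_n - b_n) % M1                           # combine the two
        yield z / (M1 + 1) if z > 0 else M1 / (M1 + 1)

print(list(cmrg()))
```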

Tests for Random Numbers

There are standard tests for uniformity and independence.
When to use tests:
- If a well-known simulation language or random-number generator is used, it is probably unnecessary to test.
- If the generator is not explicitly known or documented, tests should be applied to many sample numbers.
Types of tests:
- Theoretical tests: evaluate the mathematics of the random-number generator without actually generating any numbers.
- Empirical tests: use statistical tests on actual sequences of numbers produced.
Tests of uniformity (two different methods that check whether the observed relative frequencies are close to the expected relative frequencies):
- Kolmogorov-Smirnov test
- Chi-square test
Tests for autocorrelation: methods that check whether there is correlation among the set of random numbers.
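
Both uniformity tests are available off the shelf; the sketch below (assuming NumPy and SciPy) applies a Kolmogorov-Smirnov test and a chi-square test to a sample of random numbers.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)     # arbitrary seed
u = rng.random(10_000)              # the sample of random numbers to test

# Kolmogorov-Smirnov test against the U(0, 1) cdf.
ks_stat, ks_p = stats.kstest(u, "uniform")

# Chi-square test: compare observed bin counts with the expected (equal) counts.
observed, _ = np.histogram(u, bins=10, range=(0.0, 1.0))
chi_stat, chi_p = stats.chisquare(observed)

print(ks_p, chi_p)   # large p-values: no evidence against uniformity
```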

Generating Realizations of a Discrete Random Variable

Using the inverse-transform technique. Suppose the random variable X has the discrete probability distribution given by:

x    P(X = x)    F(x)
0    0.50        0.50
1    0.30        0.80
2    0.20        1.00

Given a random number u = 0.73, a realization of U(0,1), the corresponding realization x of X is given by:

x = 0 if u ≤ 0.5
x = 1 if 0.5 < u ≤ 0.8
x = 2 if 0.8 < u ≤ 1.0

Here 0.5 < 0.73 ≤ 0.8, so x = 1.
[Number line: [0, 1] split at 0.5 and 0.8 into segments mapped to X = 0, X = 1, and X = 2, with u = 0.73 falling in the X = 1 segment]
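
A sketch of this table-lookup inverse transform (assuming NumPy); the sampling check at the end is only an illustration that the generated proportions match P(X = x).

```python
import numpy as np

values = np.array([0, 1, 2])
probs  = np.array([0.50, 0.30, 0.20])
cdf    = np.cumsum(probs)                     # [0.5, 0.8, 1.0]

def discrete_inverse_transform(u):
    """Map a U(0, 1) realization u to a realization of X via the cdf table."""
    return values[np.searchsorted(cdf, u)]

print(discrete_inverse_transform(0.73))       # 1, as in the slide's example

rng = np.random.default_rng(7)                # arbitrary seed
sample = discrete_inverse_transform(rng.random(10_000))
print(np.bincount(sample) / sample.size)      # roughly [0.5, 0.3, 0.2]
```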

Generating Exponentially Distributed Realizations

X is an exponentially distributed random variable with λ = 1.
Let u = P(X ≤ x) = F(x) = 1 − e^{−λx} for x ≥ 0.
To generate realizations of X (random variates), let

x_i = F^{-1}(u_i) = −(1/λ) ln(1 − u_i),

where the u_i are random numbers, i.e., realizations of i.i.d. U_i ~ U(0, 1) random variables.
We can use the inverse-transform technique for some continuous distributions.
[Figure: the exponential cdf, with random numbers U_1 and U_2 on the vertical axis mapped through F^{-1} to realizations on the horizontal axis]
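
A sketch of the continuous inverse transform for the exponential case (assuming NumPy); it also generates the 200 realizations used in the example on the next slide.

```python
import numpy as np

def exponential_variates(rng, lam, size):
    """Inverse transform: x = -(1/lambda) * ln(1 - u), with u a U(0, 1) random number."""
    u = rng.random(size)
    return -np.log(1.0 - u) / lam

rng = np.random.default_rng(3)                       # arbitrary seed
x = exponential_variates(rng, lam=1.0, size=200)     # 200 variates with lambda = 1

print(x.mean())   # close to the theoretical mean 1/lambda = 1
```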

Example Generation of Realizations

Example: generate 200 realizations from an exponentially distributed random variable with λ = 1.
Examples of other distributions for which the inverse-transform technique works:
- Uniform distribution
- Weibull distribution
- Triangular distribution
- All discrete distributions

Normal (Gaussian) Distribution

A random variable X is normally distributed if it has the pdf

f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left[-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}\right], \qquad -\infty < x < \infty

with mean −∞ < µ < ∞ and variance σ² > 0. Denoted as X ~ N(µ, σ).
Special properties:
- The pdf is symmetrical around the mean, i.e., f(µ − x) = f(µ + x).
- The maximum value of the pdf occurs at x = µ; i.e., the mean and mode are equal.
- Unimodal, i.e., the pdf decreases as the distance from µ increases.
[Figure: the cdf F(x), rising from 0 through 0.5 at x = µ toward 1.0]
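
The symmetry and unimodality properties can be checked directly from the pdf; a minimal sketch, with illustrative parameter values µ = 10 and σ = 2:

```python
import math

def normal_pdf(x, mu, sigma):
    """Normal pdf: (1 / (sigma * sqrt(2*pi))) * exp(-0.5 * ((x - mu) / sigma)**2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

mu, sigma = 10.0, 2.0
print(math.isclose(normal_pdf(mu - 1.5, mu, sigma),
                   normal_pdf(mu + 1.5, mu, sigma)))                  # True: f(mu - x) = f(mu + x)
print(normal_pdf(mu, mu, sigma) > normal_pdf(mu + 0.1, mu, sigma))    # True: the mode is at mu
```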