CS 237: Probability in Computing


Wayne Snyder, Computer Science Department, Boston University
Lecture 13: Normal Distribution; Exponential Distribution

Recall that the Normal Distribution is given by an explicit formula, using the mean and variance/standard deviation as parameters:

f(x) = (1 / (σ √(2π))) e^(−(x − μ)² / (2σ²))

where μ = mean/expected value, σ = standard deviation, σ² = variance

The normal distribution, as the limit of B(N,0.5), occurs when a very large number of factors add together to create some random phenomenon. Example: Even REALLY IMPORTANT things are normally distributed!

Recall that the only way we can analyze probabilities in the continuous case is with the CDF:

P(X < a) = F(a)
P(X > a) = 1.0 − F(a)
P(a < X < b) = F(b) − F(a)
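These three CDF identities are easy to try out in code. A minimal sketch (not from the slides): the normal CDF can be written in terms of the error function as F(x) = (1 + erf((x − μ)/(σ√2)))/2, and the parameters μ = 100, σ = 15 below are a hypothetical example.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2), via F(x) = (1 + erf((x - mu)/(sigma*sqrt(2)))) / 2."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 100.0, 15.0                                 # hypothetical parameters
p_below   = normal_cdf(115, mu, sigma)                  # P(X < 115) = F(115)
p_above   = 1.0 - normal_cdf(130, mu, sigma)            # P(X > 130) = 1 - F(130)
p_between = normal_cdf(115, mu, sigma) - normal_cdf(85, mu, sigma)  # P(85 < X < 115)
```

Note that p_between is P(μ − σ < X < μ + σ), which should come out near the familiar 68% from the empirical rule.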


Standard Normal Distribution

Since there are potentially infinitely many Normal Distributions, we sometimes calculate using a normalized version, the Standard Normal Distribution, with mean 0 and standard deviation (and variance) 1. Any random variable X which has a normal distribution N(μ, σ²) can be converted into a standardized random variable with distribution N(0, 1) by the usual form; in the case of the normal distribution this is labelled Z:

Z = (X − μ) / σ

This is usually helpful in HAND calculations, since μ and σ have been factored out...
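Standardization can be sketched in a few lines. The numbers below (X ~ N(68, 3²), asking for P(X < 71)) are a hypothetical example, not from the slides; the point is that after standardizing we only ever need one CDF, Φ, for N(0, 1).

```python
import math

def phi(z):
    """CDF of the Standard Normal N(0, 1)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical example: X ~ N(68, 3^2); what is P(X < 71)?
mu, sigma = 68.0, 3.0
a = 71.0
z = (a - mu) / sigma        # standardize: Z = (X - mu) / sigma
p = phi(z)                  # P(X < 71) = P(Z < 1) = Phi(1)
```

This is exactly what the old printed tables did: they tabulated Φ(z) once, and every normal probability was reduced to a table lookup by the substitution above.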


If you were doing these calculations in 1900, or hadn't heard of a computer, or were taking a test without a calculator, here is how you would calculate the probability of a normally distributed random variable: standardize, then look up Φ(z) in a table of the standard normal CDF.

Modern people use the appropriate formulae:

Or a calculator or a web site:

Recall: Poisson Process

Formally, we have the following definition: suppose we have discrete events occurring through time as just described, and let N[s..t] = the number of arrivals in the interval [s, t], such that

1) The expected value of N[s..t] is proportional to the length (t − s) of the interval; in particular, for any two non-overlapping intervals of the same length, the mean number of occurrences in each is the same;
2) The numbers of arrivals in two non-overlapping intervals are independent; and
3) The probability of two events occurring at the same time is 0.

Then this random process is said to be a Poisson Process. We shall be dealing only with discrete intervals of time for now, and so the important things to remember are that the intervals are independent and the mean number of arrivals in each time unit is the same over its infinite range.

Recall: Poisson Random Variables

Suppose we have a Poisson Process and we fix the unit time interval we consider (say, 1 second or 1 year, etc.), where the mean number of arrivals in a unit interval is λ; each time we poke the random variable X, it returns N[0..1], N[1..2], N[2..3], etc. Then we call X a Poisson Random Variable with rate parameter λ, denoted X ~ Poisson(λ), where

P(X = n) = e^(−λ) λⁿ / n!   for n = 0, 1, 2, ...
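The pmf above is easy to sanity-check numerically. A minimal sketch (λ = 3 is an arbitrary choice): the probabilities should sum to 1, and the mean Σ n·P(X = n) should come out to λ.

```python
import math

def poisson_pmf(n, lam):
    """P(X = n) = e^(-lam) * lam^n / n!  for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**n / math.factorial(n)

lam = 3.0
# Truncating the infinite sum at n = 99 loses a negligible tail for small lam.
total = sum(poisson_pmf(n, lam) for n in range(100))      # should be (essentially) 1
mean  = sum(n * poisson_pmf(n, lam) for n in range(100))  # should be lam
```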

Interarrival Times of a Poisson Process

Suppose we have a Poisson Process, and instead of counting the number of arrivals in each unit interval, we look at the interarrival times, i.e., the amount of time between each arrival. Intuitively, this is a natural thing to think about: how long before the next event?

Let's define the random variable Y = the arrival time of the first event. In fact, because the arrivals are independent, at any time t the Poisson process probabilistically starts all over again (the events don't remember the past!), so in fact

Y = the interarrival time between any two events.

Now the question is: what is the distribution of Y?

Interarrival Times of a Poisson Process

What is the distribution of Y? Since X ~ Poisson(λ) counts the arrivals in a unit interval, and the number of arrivals in an interval is proportional to its length, that is, E(N[0..2]) = 2 · E(N[0..1]), etc., then N[0..t] ~ Poisson(λt), and so the probability that there are n arrivals by time t is

P(N[0..t] = n) = e^(−λt) (λt)ⁿ / n!

and

P(Y > t) = P(N[0..t] = 0) = e^(−λt).
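The survival probability P(Y > t) = e^(−λt) can also be seen as a Bernoulli limit, echoing how we built the Poisson from the binomial. A sketch (λ = 2 and t = 1.5 are arbitrary choices): slice [0, t] into many tiny intervals, each with arrival probability λ·dt; "no arrival by time t" then has probability (1 − λ·dt)^steps, which converges to e^(−λt).

```python
import math

lam, t = 2.0, 1.5
approximations = []
for steps in (10, 1_000, 100_000):
    dt = t / steps
    # probability that every one of the tiny Bernoulli slices has no arrival
    approximations.append((1.0 - lam * dt) ** steps)

exact = math.exp(-lam * t)   # P(Y > t) = P(N[0..t] = 0)
```

Running this, the approximations close in on the exact value as the slices get finer, which is the discrete-to-continuous limit underlying the whole derivation.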

Interarrival Times of a Poisson Process

What is the distribution of Y? Recall the chain rule: d/dt e^(g(t)) = g′(t) e^(g(t)). Now,

P(Y ≤ t) = 1 − P(Y > t) = 1 − e^(−λt)

is the formula of a CDF, that is, F_Y(t) = 1 − e^(−λt), and so if we take a derivative, we get the PDF:

f_Y(t) = λ e^(−λt).

Exponential Distribution

This is called the Exponential Distribution, and along with the Normal, it is one of the most important continuous distributions in probability and statistics. Formally, then, if the random variable Y = the interarrival time between events in a Poisson Process, we say that Y is distributed according to the Exponential Distribution with rate parameter λ, denoted Y ~ Exp(λ), if

f_Y(t) = λ e^(−λt)   for t ≥ 0

and

F_Y(t) = 1 − e^(−λt)   for t ≥ 0,

where E(Y) = 1/λ and Var(Y) = 1/λ².
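Because the CDF F(t) = 1 − e^(−λt) inverts in closed form, we can sample exponential interarrival times by inverse-CDF sampling and check the mean and variance empirically. A sketch (λ = 2 is an arbitrary choice, and the seed is just for reproducibility):

```python
import math
import random

random.seed(2024)
lam = 2.0

def sample_exp(lam):
    """Inverse-CDF sampling: solve u = F(t) = 1 - e^(-lam*t) for t."""
    u = 1.0 - random.random()      # in (0, 1], so log(u) is always defined
    return -math.log(u) / lam

n = 100_000
samples = [sample_exp(lam) for _ in range(n)]
sample_mean = sum(samples) / n                                   # approx E(Y) = 1/lam
sample_var  = sum((y - sample_mean) ** 2 for y in samples) / n   # approx Var(Y) = 1/lam^2
```

With λ = 2, the sample mean should land near 1/λ = 0.5 and the sample variance near 1/λ² = 0.25.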

Exponential Distribution: The Memoryless Property

The exponential, like the geometric, has the memoryless property:

P(Y > n + m | Y > m) = P(Y > n)

and the proof is the same: P(Y > n + m | Y > m) = e^(−λ(n+m)) / e^(−λm) = e^(−λn) = P(Y > n).
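The memoryless identity is a one-line numerical check using the survival function P(Y > t) = e^(−λt). A sketch (the values λ = 1.5, m = 0.7, n = 1.2 are arbitrary):

```python
import math

lam, m, n = 1.5, 0.7, 1.2

def surv(t):
    """P(Y > t) = e^(-lam*t) for Y ~ Exp(lam)."""
    return math.exp(-lam * t)

lhs = surv(n + m) / surv(m)   # P(Y > n + m | Y > m), by the definition of conditional probability
rhs = surv(n)                 # P(Y > n)
```

The two sides agree for any choice of λ, m, n ≥ 0, because e^(−λ(n+m)) / e^(−λm) = e^(−λn): having already waited m units tells you nothing about the remaining wait.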