SPECIAL DISTRIBUTIONS

Some distributions are special because they are useful. They include the Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions. We have already seen and used the exponential, Binomial and hypergeometric distributions. We will now explore the definitions and properties of the other special distributions.

Things to remember when learning probability distributions:

Mathematical Details
1. Is it continuous or discrete? What is the sample space?
2. Density (mass) function? Parameter ranges?
3. Expected value formula? Variance?

Application Context
1. Corresponding experiment? Cartoon example?
2. Any common applications? Why is it special or useful?
3. Relationships to other distributions?

Bernoulli
Sample space {0, 1}, mean p, variance p(1 - p), and mass function
p_x = p^x (1 - p)^{1 - x}.

Binomial
The number of successes in n Bernoulli(p) trials. Discrete random variable with sample space {0, ..., n}, and mass function (with parameters n, p) given by
p_x = \binom{n}{x} p^x (1 - p)^{n - x}.
The mean is np and the variance is np(1 - p).

Multinomial (Generalized Binomial)
Discrete random variable for the number of each of k types of outcomes in n trials. Sample space {0, ..., n}^k, and mass function (with parameters n, p_1, ..., p_k, where \sum_i p_i = 1) given by
p_{x_1, ..., x_k} = \frac{n!}{x_1! \cdots x_k!} p_1^{x_1} \cdots p_k^{x_k}.
The marginals are binomial, thus the means are E(X_i) = n p_i and the variances are Var(X_i) = n p_i (1 - p_i).

Hypergeometric
Discrete random variable with sample space {0, ..., w}, and mass function (with parameters N, w, n) given by
p_x = \binom{w}{x} \binom{N - w}{n - x} \Big/ \binom{N}{n}.
The mean is n w/N and the variance is n (w/N)(1 - w/N)(N - n)/(N - 1).
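As a quick numerical check of the Binomial formulas above, the following sketch (assuming Python with numpy and scipy available; n and p are arbitrary example values) evaluates the mass function directly and compares the resulting mean and variance to np and np(1 - p).

    # Sketch: verify the binomial pmf, mean and variance numerically.
    import numpy as np
    from math import comb
    from scipy.stats import binom

    n, p = 10, 0.3
    xs = np.arange(n + 1)

    # Mass function from the formula p_x = C(n, x) p^x (1 - p)^(n - x)
    pmf = np.array([comb(n, x) * p**x * (1 - p)**(n - x) for x in xs])

    assert np.allclose(pmf, binom.pmf(xs, n, p))   # matches scipy's binomial pmf
    assert np.isclose(pmf.sum(), 1.0)              # probabilities sum to 1

    mean = (xs * pmf).sum()
    var = ((xs - mean) ** 2 * pmf).sum()
    print(mean, n * p)            # both 3.0
    print(var, n * p * (1 - p))   # both 2.1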

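A similar sketch for the Hypergeometric mean and variance above, assuming scipy is available (note that scipy.stats.hypergeom takes the population size N, the number of marked items w, and the sample size n, in that order; the example values are arbitrary):

    # Sketch: hypergeometric mean/variance vs. the formulas above.
    from scipy.stats import hypergeom

    N, w, n = 50, 20, 10
    rv = hypergeom(N, w, n)   # scipy order: population size, marked items, draws

    mean_formula = n * w / N
    var_formula = n * (w / N) * (1 - w / N) * (N - n) / (N - 1)

    print(rv.mean(), mean_formula)   # both 4.0
    print(rv.var(), var_formula)     # both ~1.96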
Generalized Hypergeometric
Discrete random variable with parameters n, n_1, ..., n_k, where \sum_i n_i = N, and mass function
p_{x_1, ..., x_k} = \binom{n_1}{x_1} \cdots \binom{n_k}{x_k} \Big/ \binom{N}{n}.
The marginals are Hypergeometric.

Uniform (Continuous)
Continuous random variable with sample space [a, b], and density function f_x = (b - a)^{-1}. The mean is (b + a)/2 and the variance is (b - a)^2/12.

Poisson distribution
Discrete random variable with parameter λ > 0 and mass function
P(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}, for k = 0, 1, 2, ...
The mean and variance are the same, namely E(X) = Var(X) = λ.

Poisson Approximation to the Binomial distribution
Let X ~ Bin(n, p) be a binomial random variable with number of trials n and probability of success p. Then, for large n and small p, that is, when n → ∞ and p → 0 in such a way that np = λ is held constant, we have
\lim_{n \to \infty,\, p \to 0} P(X = k) = \frac{e^{-np} (np)^k}{k!} = \frac{e^{-\lambda} \lambda^k}{k!}.
For large values of n and small p, we can therefore approximate the Binomial distribution with a Poisson distribution:
P(X = k) \approx \frac{e^{-np} (np)^k}{k!}.

Poisson Model
Suppose events can occur in space or time in such a way that:
1. The probability that two events occur in the same small area or time interval is zero.
2. The events in disjoint areas or time intervals occur independently.
3. The probability that an event occurs in a given area or time interval depends only on the size of the area or the length of the time interval, and not on its location.

Poisson Process
Suppose that events satisfying the Poisson model occur at the rate λ per unit time. Let X(t) denote the number of events occurring in a time interval of length t. Then
P(X(t) = k) = \frac{e^{-\lambda t} (\lambda t)^k}{k!},
and X(t) is called a Poisson process with rate λ.
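The quality of the Poisson approximation stated above is easy to inspect numerically. A minimal sketch, assuming Python with scipy available (the values of n and p are illustrative):

    # Sketch: Poisson approximation to the binomial for large n, small p.
    import numpy as np
    from scipy.stats import binom, poisson

    n, p = 1000, 0.003        # np = 3 plays the role of lambda
    lam = n * p
    ks = np.arange(11)

    exact = binom.pmf(ks, n, p)
    approx = poisson.pmf(ks, lam)

    for k, b, q in zip(ks, exact, approx):
        print(f"k={k:2d}  binomial={b:.5f}  poisson={q:.5f}")
    print("max abs difference:", np.max(np.abs(exact - approx)))  # small, shrinks as n grows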

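The Poisson model and the resulting Poisson process can also be checked by brute force: chop an interval of length t into many tiny sub-intervals, let an event occur in each sub-interval independently with probability λ Δt, and compare the distribution of the total count with Poisson(λt). A sketch under those assumptions (rate, window and simulation sizes are illustrative); the total count over m independent tiny intervals is exactly Binomial(m, λ Δt), which is how it is sampled here:

    # Sketch: simulate the Poisson model via many small independent Bernoulli sub-intervals.
    import numpy as np
    from scipy.stats import poisson

    rng = np.random.default_rng(0)
    lam, t = 2.0, 3.0            # rate 2 events per unit time, window of length 3
    m = 10_000                   # number of tiny sub-intervals
    dt = t / m
    reps = 50_000

    # Total count in [0, t] = sum of m Bernoulli(lam*dt) indicators = Binomial(m, lam*dt).
    counts = rng.binomial(m, lam * dt, size=reps)

    print("empirical mean:", counts.mean(), " lambda*t:", lam * t)   # both ~6
    for k in range(4, 9):
        print(f"P(X={k}): empirical {np.mean(counts == k):.4f}  Poisson {poisson.pmf(k, lam * t):.4f}")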
Exponential
Continuing from above, the waiting time Y between consecutive events has an exponential distribution with parameter λ (that is, with mean 1/λ):
P(Y > t) = e^{-\lambda t}, t > 0,
or equivalently,
f(t) = \lambda e^{-\lambda t}, for t > 0.
The mean is 1/λ and the variance is 1/λ^2.
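A corresponding sketch for the exponential waiting time, checking the mean 1/λ, the variance 1/λ^2 and the tail probability P(Y > t) = e^{-λt} by simulation (rate and sample size are illustrative; numpy parametrizes the exponential by its mean 1/λ):

    # Sketch: exponential waiting times between Poisson events.
    import numpy as np

    rng = np.random.default_rng(1)
    lam = 2.0
    gaps = rng.exponential(scale=1 / lam, size=200_000)   # scale = mean = 1/lambda

    t = 1.0
    print(gaps.mean(), 1 / lam)                   # ~0.5
    print(gaps.var(), 1 / lam ** 2)               # ~0.25
    print((gaps > t).mean(), np.exp(-lam * t))    # ~0.135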

Geometric and Negative Binomial distributions

Geometric experiment: Toss a fair coin until the first H appears. Let X = number of tosses required for the first H. Then X has a geometric distribution with probability of success 0.5.

Definition: Geometric distribution
A random variable X has a geometric distribution with parameter p if its pmf is
P(X = k) = (1 - p)^{k - 1} p, for k = 1, 2, 3, ...
It is denoted X ~ Geo(p). The mean and variance of a geometric distribution are E[X] = 1/p and Var(X) = (1 - p)/p^2, respectively. The mgf of X is
M_X(t) = \frac{p e^t}{1 - (1 - p) e^t}.

Memoryless property of the geometric distribution
Let X ~ Geo(p). Then for any n and k, we have P(X = n + k | X > n) = P(X = k).

Negative Binomial experiment
Think of the geometric experiment performed until we get r successes. Let X = number of trials until we have r successes.

Definition: Negative Binomial distribution
A random variable X has a negative binomial distribution with parameters r and p if its pmf is
p_X(k) = P(X = k) = \binom{k - 1}{r - 1} (1 - p)^{k - r} p^r, for k = r, r + 1, r + 2, ...
A common notation for X is X ~ NegBin(r, p). The mean and variance of X are E[X] = r/p and Var(X) = r(1 - p)/p^2. The mgf of X is
M_X(t) = \left[ \frac{p e^t}{1 - (1 - p) e^t} \right]^r.

Connection between the negative binomial and geometric distributions
If X_1, X_2, ..., X_r are iid Geo(p), then \sum_{i=1}^r X_i ~ NegBin(r, p).
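The memoryless property can be verified exactly from the pmf. A short sketch (the values of p, n and k are arbitrary) computes both sides of P(X = n + k | X > n) = P(X = k), using P(X > n) = (1 - p)^n:

    # Sketch: exact check of the geometric memoryless property.
    p = 0.3
    geo_pmf = lambda k: (1 - p) ** (k - 1) * p    # P(X = k), k = 1, 2, 3, ...

    for n in (2, 5):
        tail = (1 - p) ** n                       # P(X > n)
        for k in (1, 3, 7):
            lhs = geo_pmf(n + k) / tail           # P(X = n + k | X > n)
            rhs = geo_pmf(k)
            print(n, k, lhs, rhs, abs(lhs - rhs) < 1e-12)   # equal up to rounding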

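The connection between the two distributions can likewise be checked by simulation: sum r independent Geo(p) draws and compare the empirical frequencies with the negative binomial pmf above. A sketch with illustrative r, p and sample size (numpy's geometric generator counts trials up to and including the first success, matching the convention used here):

    # Sketch: a sum of r iid Geo(p) variables follows NegBin(r, p).
    import numpy as np
    from math import comb

    rng = np.random.default_rng(2)
    r, p = 3, 0.4
    sums = rng.geometric(p, size=(200_000, r)).sum(axis=1)

    def negbin_pmf(k):
        # P(X = k) = C(k-1, r-1) (1-p)^(k-r) p^r, for k = r, r+1, ...
        return comb(k - 1, r - 1) * (1 - p) ** (k - r) * p ** r

    print(sums.mean(), r / p)                  # ~7.5
    print(sums.var(), r * (1 - p) / p ** 2)    # ~11.25
    for k in range(r, r + 5):
        print(k, np.mean(sums == k), negbin_pmf(k))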
The Gamma distribution

Definition: The Gamma function
For any positive real number r > 0, the gamma function of r is denoted Γ(r) and equals
\Gamma(r) = \int_0^\infty y^{r - 1} e^{-y} \, dy.

Theorem: Properties of the Gamma function
The Gamma function satisfies the following properties:
1. Γ(1) = 1
2. Γ(r) = (r - 1) Γ(r - 1)
3. For r a positive integer, Γ(r) = (r - 1)!

Definition of the Γ(r, λ) random variable
For any real positive numbers r > 0 and λ > 0, a random variable with pdf
f_X(x) = \frac{\lambda^r}{\Gamma(r)} x^{r - 1} e^{-\lambda x}, x > 0,
is said to have a Gamma distribution with parameters r and λ, denoted X ~ Γ(r, λ).

Theorem: Moments and mgf of a Gamma distribution
If X ~ Γ(r, λ), then
1. E[X] = r/λ
2. Var(X) = r/λ^2
3. The mgf of X is M_X(t) = (1 - t/λ)^{-r}.

Theorem
Let X_1, X_2, ..., X_n be iid exponential random variables with parameter λ, that is, with mean 1/λ. Then the sum of the X_i has a gamma distribution with parameters n and λ. More precisely,
\sum_{i=1}^n X_i ~ Γ(n, λ).

Theorem
A sum of independent gamma random variables X ~ Γ(r, λ) and Y ~ Γ(s, λ) with the same λ has a gamma distribution with shape parameter r + s and the same λ. That is, X + Y ~ Γ(r + s, λ).

Note: In a sequence of Poisson events occurring with rate λ per unit time/area, the waiting time for the r-th event has a Γ(r, λ) distribution.
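A final sketch ties these results together: it evaluates the Gamma pdf above directly and compares it with scipy (whose gamma distribution is parametrized by the shape r and the scale 1/λ), and checks that a sum of r iid Exponential(λ) variables has the Γ(r, λ) mean r/λ and variance r/λ^2 (parameter values and sample size are illustrative):

    # Sketch: Gamma pdf formula and the sum-of-exponentials connection.
    import numpy as np
    from math import gamma as gamma_fn
    from scipy.stats import gamma

    r, lam = 4, 2.0

    def gamma_pdf(x):
        # f(x) = lambda^r / Gamma(r) * x^(r-1) * exp(-lambda * x), x > 0
        return lam ** r / gamma_fn(r) * x ** (r - 1) * np.exp(-lam * x)

    xs = np.array([0.5, 1.0, 2.5])
    print(gamma_pdf(xs))
    print(gamma.pdf(xs, a=r, scale=1 / lam))   # same values; scipy's scale is 1/lambda

    rng = np.random.default_rng(3)
    sums = rng.exponential(scale=1 / lam, size=(100_000, r)).sum(axis=1)
    print(sums.mean(), r / lam)         # ~2.0
    print(sums.var(), r / lam ** 2)     # ~1.0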