STAT 3610: Review of Probability Distributions

STAT 3610: Review of Probability Distributions Mark Carpenter Professor of Statistics Department of Mathematics and Statistics August 25, 2015

Support of a Random Variable. Definition: The support of a random variable X, denoted X, is defined to be the set of all points on the real line for which the pdf/pmf is non-zero. That is, X = {x ∈ R : f_X(x) > 0}, where the braces indicate a set.

Support of a Random Variable. The support of a random variable is usually denoted by the script form of the letter naming the random variable: the random variable X has support X, the random variable Y has support Y, and the random variable Z has support Z. The support is one of the first characteristics we can use to help identify the distribution of a random variable.

Continuous versus Discrete Random Variables. Definition (Continuous Random Variable): A random variable is said to be continuous if its support is a continuous set (made up of unions and intersections of real intervals). The CDF* of a continuous random variable must be a continuous function on the real line. Definition (Discrete Random Variable): A random variable is said to be discrete if its support is a discrete set. The CDF* of a discrete random variable is not continuous, but it is right-continuous. *CDF stands for Cumulative Distribution Function.

Difference between Discrete and Continuous Random Variables. So, whether the random variable X is continuous or discrete depends on whether its support is continuous or discrete. In the discrete case, the pmf* is a discontinuous function with a positive mass (probability) at each point in the support. In the continuous case, the pdf** itself does not have to be a continuous function everywhere, but it is usually a continuous function on intervals in the support. *pmf stands for probability mass function. **pdf stands for probability density function.

Exponential Random Variable and Exponential Distribution. Example 1 (continuous support): Suppose X ~ exponential(θ). From Section 3.2 (pp. 95-113) of the textbook, the exponential distribution, indexed by the scale parameter θ (θ > 0), is

f(x; θ) = (1/θ) e^{−x/θ} I_[0,∞)(x) = { (1/θ) e^{−x/θ}, x ≥ 0; 0, otherwise },

which means {x ∈ R : f(x) > 0} = [0, ∞), and the support of X is X = [0, ∞), a continuous set. We see that the pdf for the exponential is zero for all points below zero, then jumps to (1/θ)e^0 = 1/θ at x = 0, and is continuous on [0, ∞).
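A minimal sketch of this pdf in Python (function names are ours, not from the textbook), with the indicator I_[0,∞)(x) handled explicitly:

```python
import math

def exponential_pdf(x, theta=1.0):
    """pdf of the exponential distribution with scale theta (theta > 0).

    Returns (1/theta) * exp(-x/theta) for x >= 0 and 0 otherwise,
    mirroring the indicator I_[0, inf)(x) on the slide.
    """
    if theta <= 0:
        raise ValueError("theta must be positive")
    return math.exp(-x / theta) / theta if x >= 0 else 0.0

# The pdf is zero below 0 and jumps to 1/theta at x = 0.
print(exponential_pdf(-1.0))      # 0.0
print(exponential_pdf(0.0, 2.0))  # 0.5, i.e. 1/theta
```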

[Figure: pdf f(x) and cdf F(x) of the standard exponential; the cdf satisfies F(a) = 1 − e^{−a}.]

Exponential is a special case of Gamma and Weibull. You can verify that the non-truncated gamma and Weibull distributions, of which the exponential is a special case, share this same support. If X is a normal random variable, then the support is X = (−∞, ∞) = R.

Mean or Expected Value of a Random Variable. Recall, for any random variable X with pdf/pmf f(x), a measure of central tendency of the population is the population mean µ, the expected value (long-run average) of X. More formally, Population Mean: For any random variable X with pdf/pmf f(x), the population mean is µ = E(X), where

µ = E(X) = ∫_{−∞}^{∞} x f(x) dx if X is continuous,
µ = E(X) = Σ_{x ∈ X} x f(x) if X is discrete.
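Both branches of this definition can be checked numerically. The sketch below (ours, not from the textbook) computes a discrete mean by direct summation and approximates a continuous mean with a midpoint Riemann sum:

```python
import math

def discrete_mean(pmf):
    """Population mean of a discrete r.v.: sum of x * f(x) over the support."""
    return sum(x * p for x, p in pmf.items())

def continuous_mean(pdf, a, b, n=200_000):
    """Approximate the integral of x * f(x) over [a, b] by a midpoint Riemann sum."""
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) * pdf(a + (i + 0.5) * h) for i in range(n)) * h

# Discrete: Binomial(n=4, p=1/2), pmf C(4, x) / 2^4; mean is n*p = 2.
pmf = {x: math.comb(4, x) / 16 for x in range(5)}
print(discrete_mean(pmf))  # 2.0

# Continuous: Exp(theta=1); truncating the integral at 40 loses a negligible tail.
approx = continuous_mean(lambda x: math.exp(-x), 0.0, 40.0)
print(approx)  # close to 1.0, the exponential mean for theta = 1
```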

Population Variance or Variance of a Random Variable. Population Variance: For any random variable X with pdf/pmf f(x), the population variance is σ² = E(X − µ)², where

σ² = E(X − µ)² = ∫_{−∞}^{∞} (x − µ)² f(x) dx if X is continuous,
σ² = E(X − µ)² = Σ_{x ∈ X} (x − µ)² f(x) if X is discrete.

Sometimes Easier Way to Compute Population Variance. Note that it is often easier to compute the variance by noting that

σ² = E(X − µ)² = E(X² − 2µX + µ²) = E(X²) − 2µE(X) + µ² = E(X²) − (E(X))².

So, rather than working through the original expression, one need only compute E(X²) and µ = E(X) and plug the results into the expression σ² = E(X²) − µ².
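A quick numerical sanity check (code and names ours) that the shortcut E(X²) − µ² agrees with the defining expression E(X − µ)²:

```python
import math

def moments(pmf):
    """Return (mean, variance) of a discrete pmf, using the shortcut
    sigma^2 = E(X^2) - mu^2."""
    mu = sum(x * p for x, p in pmf.items())
    ex2 = sum(x * x * p for x, p in pmf.items())
    return mu, ex2 - mu * mu

def variance_direct(pmf):
    """Variance computed straight from the definition E(X - mu)^2, for comparison."""
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

pmf = {x: math.comb(4, x) / 16 for x in range(5)}  # Binomial(4, 1/2)
mu, var = moments(pmf)
print(mu, var)               # 2.0 1.0 (n*p and n*p*(1-p))
print(variance_direct(pmf))  # 1.0 -- the two routes agree
```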

Expectations of Functions of a Random Variable. You might notice that each of E(X), E(X²), and E(X − µ)² is the expected value of a different function of X: g_1(x) = x, g_2(x) = x², and g_3(x) = (x − µ)². In general, for any function g,

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx if X is continuous,
E[g(X)] = Σ_{x ∈ X} g(x) f(x) if X is discrete.

Moment Generating Functions. Whenever it exists, the moment-generating function for a random variable X, denoted M_X(t), is the continuous function of t given by

M_X(t) = E[e^{tX}], t ∈ (−h, h), h > 0.

The interval (−h, h) is referred to as the radius of convergence.

Properties of a Moment Generating Function (mgf). This function is called the moment-generating function because you can find the n-th moment of the random variable X by computing its n-th derivative with respect to t and then setting t = 0, as follows:

E(X^n) = M_X^(n)(0) = d^n/dt^n M_X(t) |_{t=0}.

Notice that the moment generating function is a continuous and differentiable function of t for |t| < h, whether or not X is continuous. In fact, the moment generating function is mathematically independent of the original variable (since it was integrated or summed over the support) and relates to the random variable X only through the moments of the distribution and any related parameters.
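This derivative-at-zero property can be illustrated numerically. The sketch below (ours; finite differences stand in for exact derivatives) recovers the first two moments of an exponential(θ) from its mgf M(t) = 1/(1 − θt):

```python
def mgf_exponential(t, theta=1.0):
    """mgf of the exponential(theta): 1 / (1 - theta*t), valid for t < 1/theta."""
    return 1.0 / (1.0 - theta * t)

def first_moment_from_mgf(mgf, h=1e-5):
    """Approximate M'(0) = E(X) with a central difference."""
    return (mgf(h) - mgf(-h)) / (2 * h)

def second_moment_from_mgf(mgf, h=1e-4):
    """Approximate M''(0) = E(X^2) with a second central difference."""
    return (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / (h * h)

# For theta = 2: E(X) = theta = 2 and E(X^2) = 2*theta^2 = 8.
m = lambda t: mgf_exponential(t, theta=2.0)
print(first_moment_from_mgf(m))   # ~2.0
print(second_moment_from_mgf(m))  # ~8.0
```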

Properties of the Exponential. We will show on the chalkboard that if X ~ Exp(θ) then

∫ f(x) dx = ∫_0^∞ (1/θ) e^{−x/θ} dx = 1,
µ = E(X) = ∫ x f(x) dx = ∫_0^∞ (x/θ) e^{−x/θ} dx = θ,
σ² = E(X − µ)² = ∫ (x − µ)² f(x) dx = θ².

The Cumulative Distribution Function (cdf), for any w ≥ 0, is

F(w) = P(X ≤ w) = 1 − e^{−w/θ}.

The moment generating function (mgf), denoted M(t), exists and

M(t) = 1/(1 − θt), t < 1/θ.
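The cdf identity can be checked by integrating the pdf directly. A small sketch (ours) comparing the closed form F(w) = 1 − e^{−w/θ} against a midpoint Riemann sum of the pdf:

```python
import math

def exp_cdf(w, theta=1.0):
    """Closed-form cdf from the slide: F(w) = 1 - exp(-w/theta) for w >= 0."""
    return 1.0 - math.exp(-w / theta) if w >= 0 else 0.0

def exp_cdf_by_integration(w, theta=1.0, n=100_000):
    """Same cdf, obtained by a midpoint Riemann sum of the pdf on [0, w]."""
    h = w / n
    return sum(math.exp(-(i + 0.5) * h / theta) / theta for i in range(n)) * h

print(exp_cdf(3.0, theta=1.5))                 # 1 - e^{-2} = 0.8646...
print(exp_cdf_by_integration(3.0, theta=1.5))  # agrees to several decimal places
```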

Discrete Example (Binomial). Example (discrete support): Suppose Y is a Binomial random variable with parameters n and p (see page 117 of the textbook). Then the probability mass function (pmf) is

f_Y(y) = C(n, y) p^y (1 − p)^{n−y}, y = 0, 1, ..., n; 0 otherwise,

which means {y : f(y) > 0} = {0, 1, ..., n}, and the support of Y is Y = {0, 1, ..., n}, a discrete (and finite) set of points. Recall that the binomial coefficient is C(n, y) = n! / (y!(n − y)!).

cdf for a Binomial Random Variable. Example 2 (binomial): Suppose X is Binomial(4, 0.5). Then the pmf is

f(x; n = 4, p = 1/2) = C(4, x) (1/2)^4, x ∈ X = {0, 1, 2, 3, 4}.

Table: CDF for Binomial(n = 4, p = 1/2)

x in (−∞, 0): P(X < 0) = 0
x in [0, 1): P(X ≤ 0) = P(0) = 1/16
x in [1, 2): P(X ≤ 1) = P(0) + P(1) = (1 + 4)/16 = 5/16
x in [2, 3): P(X ≤ 2) = P(0) + P(1) + P(2) = (1 + 4 + 6)/16 = 11/16
x in [3, 4): P(X ≤ 3) = P(0) + P(1) + P(2) + P(3) = (1 + 4 + 6 + 4)/16 = 15/16
x in [4, ∞): P(X ≤ 4) = P(0) + P(1) + P(2) + P(3) + P(4) = (1 + 4 + 6 + 4 + 1)/16 = 1
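The running sums in this table are easy to reproduce in a few lines of Python (code ours):

```python
import math

n, p = 4, 0.5

def binom_pmf(x):
    """pmf of Binomial(4, 1/2): C(4, x) * (1/2)^4."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x):
    """P(X <= x): running sum of the pmf, exactly as in the table."""
    return sum(binom_pmf(k) for k in range(0, x + 1))

for x in range(n + 1):
    # 0.0625, 0.3125, 0.6875, 0.9375, 1.0 (= 1/16, 5/16, 11/16, 15/16, 1)
    print(x, binom_cdf(x))
```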

[Figure: step-function cdf of the Binomial(4, 1/2), with jumps at x = 0, 1, 2, 3, 4 rising through 1/16, 5/16, 11/16, 15/16, and 1.]

Binomial, hypergeometric, geometric. One can verify that the supports for the negative binomial(r, p), the geometric(p), and the Poisson random variables (see page 126) are the same countably infinite set {0, 1, 2, ...}. The support of the hypergeometric(n, M, N) is the discrete, finite set {max(0, n − N + M), ..., min(n, M)}.

Poisson Random Variable. Suppose X is a discrete random variable with a Poisson(λ) distribution, where λ > 0. Then the probability mass function (pmf) is

f(x) = λ^x e^{−λ} / x!, x = 0, 1, 2, ...; 0 otherwise.

Recall that the Maclaurin series of the exponential function e^y is e^y = Σ_{i=0}^∞ y^i / i!, where i! denotes the factorial of the integer i.

Some Properties of a Poisson Random Variable

Σ_{x ∈ X} f(x; λ) = Σ_{x=0}^∞ λ^x e^{−λ} / x! = 1,
µ = E(X) = Σ_{x ∈ X} x f(x) = λ,
σ² = E(X − µ)² = Σ_{x ∈ X} (x − µ)² f(x) = λ,
M(t) = E(e^{tX}) = Σ_{x ∈ X} e^{tx} f(x) = e^{λ(e^t − 1)}, −∞ < t < ∞.
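All four properties can be verified by truncated summation over the countably infinite support, since the Poisson tail decays extremely fast. A sketch (code ours; the truncation point 100 is an assumption that comfortably covers λ = 3):

```python
import math

def poisson_pmf(x, lam):
    """pmf of the Poisson(lam): lam^x * exp(-lam) / x!."""
    return lam**x * math.exp(-lam) / math.factorial(x)

lam = 3.0
terms = [poisson_pmf(x, lam) for x in range(100)]  # tail beyond 100 is negligible
total = sum(terms)
mean = sum(x * t for x, t in enumerate(terms))
var = sum((x - mean) ** 2 * t for x, t in enumerate(terms))

# mgf at t = 0.5: direct sum of e^{tx} f(x) versus the closed form e^{lam(e^t - 1)}.
t = 0.5
mgf_sum = sum(math.exp(t * x) * poisson_pmf(x, lam) for x in range(100))
mgf_closed = math.exp(lam * (math.exp(t) - 1))

print(total, mean, var)     # ~1.0, ~3.0, ~3.0: unit mass, mean lam, variance lam
print(mgf_sum, mgf_closed)  # the two mgf computations agree
```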