STAT/MATH 395 A - PROBABILITY II (UW, Winter Quarter): Moment functions

STAT/MATH 395 A - PROBABILITY II, UW Winter Quarter 07
Néhémy Lim

Moment functions

1 Moments of a random variable

Definition 1.1. Let X be a rrv on a probability space (Ω, A, P). For a given r ∈ ℕ, E[X^r], if it exists, is called the r-th moment of X. In particular:

- If X is a discrete rrv with pmf p_X, and X(Ω) is the set of all values that X can take on, then

      E[X^r] = \sum_{x \in X(\Omega)} x^r p_X(x)    (1)

- If X is a continuous rrv with pdf f_X, then

      E[X^r] = \int_{-\infty}^{+\infty} x^r f_X(x) \, dx    (2)

Remarks:
- The zero-th moment is 1.
- The first moment is the expected value of X.

Definition 1.2. Let X be a rrv on a probability space (Ω, A, P). For a given r ∈ ℕ, E[(X − E[X])^r], if it exists, is called the r-th moment about the mean, or r-th central moment, of X. In particular:

- If X is a discrete rrv with pmf p_X, and X(Ω) is the set of all values that X can take on, then

      E[(X - E[X])^r] = \sum_{x \in X(\Omega)} (x - E[X])^r p_X(x)    (3)

- If X is a continuous rrv with pdf f_X, then

      E[(X - E[X])^r] = \int_{-\infty}^{+\infty} (x - E[X])^r f_X(x) \, dx    (4)

Remarks:
- The zero-th central moment is 1.
- The first central moment is 0.
- The second central moment is the variance of X.
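The moment formulas above can be checked numerically. The following short Python sketch (not part of the notes; the pmf and helper names are illustrative) computes raw and central moments of a fair die directly from the defining sums:

```python
import math

def raw_moment(pmf, r):
    """r-th moment E[X^r] = sum of x^r * p_X(x) over X(Omega)."""
    return sum(x ** r * p for x, p in pmf.items())

def central_moment(pmf, r):
    """r-th central moment E[(X - E[X])^r]."""
    mu = raw_moment(pmf, 1)
    return sum((x - mu) ** r * p for x, p in pmf.items())

# Illustrative pmf: a fair six-sided die, p_X(i) = 1/6 for i = 1, ..., 6.
die = {i: 1 / 6 for i in range(1, 7)}
assert math.isclose(raw_moment(die, 0), 1)                      # zero-th moment is 1
assert math.isclose(raw_moment(die, 1), 3.5)                    # E[X] of a fair die
assert math.isclose(central_moment(die, 1), 0, abs_tol=1e-12)   # first central moment is 0
assert math.isclose(central_moment(die, 2), 35 / 12)            # Var(X) = 35/12
```

The assertions confirm the remarks above: the zero-th moments are 1, the first central moment vanishes, and the second central moment is the variance.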

2 Moment generating functions

Definition 2.1. Let X be a rrv on a probability space (Ω, A, P). For a given t ∈ ℝ, the moment generating function (m.g.f.) of X, denoted M_X(t), is defined as follows:

      M_X(t) = E[e^{tX}]    (5)

where there is a positive number h such that the above expectation exists for −h < t < h. In particular:

- If X is a discrete rrv with pmf p_X, and X(Ω) is the set of all values that X can take on, then

      M_X(t) = \sum_{x \in X(\Omega)} e^{tx} p_X(x)    (6)

- If X is a continuous rrv with pdf f_X, then

      M_X(t) = \int_{-\infty}^{+\infty} e^{tx} f_X(x) \, dx    (7)

Remark:
- The m.g.f. at zero, M_X(0), always exists and is equal to 1.

Theorem 2.1. A moment generating function completely determines the distribution of a real-valued random variable.

Proposition 2.1. Let X be a rrv on a probability space (Ω, A, P). The r-th moment of X can be found by evaluating the r-th derivative of the m.g.f. of X at t = 0:

      M_X^{(r)}(0) = E[X^r]    (8)

Proof. Using the expansion of the exponential function as a series, we have:

      e^{tX} = \sum_{n=0}^{\infty} \frac{(tX)^n}{n!}

Hence,

      M_X(t) = E[e^{tX}]
             = E\left[ \sum_{n=0}^{\infty} \frac{(tX)^n}{n!} \right]
             = \sum_{n=0}^{\infty} \frac{t^n}{n!} E[X^n]
             = 1 + t E[X] + \frac{t^2}{2} E[X^2] + \frac{t^3}{6} E[X^3] + ...

Differentiating M_X r times, the first r terms vanish. We now have the following:

      \frac{d^r M_X(t)}{dt^r} = \frac{r(r-1) \cdots 1}{r!} E[X^r] + \frac{(r+1)r \cdots 2}{(r+1)!} t E[X^{r+1}] + \frac{(r+2)(r+1) \cdots 3}{(r+2)!} t^2 E[X^{r+2}] + ...
                              = E[X^r] + \sum_{k=1}^{\infty} \frac{t^k}{k!} E[X^{r+k}]

It now becomes clear that evaluating \frac{d^r M_X(t)}{dt^r} at t = 0 gives the result.

Lemma 2.1. Let X be a rrv on a probability space (Ω, A, P).

1. The expected value of X, if it exists, can be found by evaluating the first derivative of the moment generating function at t = 0:

      E[X] = M_X'(0)    (9)

2. The variance of X, if it exists, can be found by evaluating the first and second derivatives of the moment generating function at t = 0:

      Var(X) = M_X''(0) - (M_X'(0))^2    (10)

3 Moment generating functions of common discrete distributions

3.1 Discrete Uniform Distribution

Definition 3.1 (Discrete Uniform Distribution). Let (Ω, A, P) be a probability space and let X be a random variable that can take n ∈ ℕ values on X(Ω) = {1, ..., n}. X is said to have a discrete uniform distribution U_n if its probability mass function is given by:

      p_X(i) = P(X = i) = \frac{1}{n}, for i = 1, ..., n    (11)

Computation of the mgf. Let X be a random variable that follows a discrete uniform distribution U_n. For t ≠ 0, the mgf of X is given by:

      M_X(t) = E[e^{tX}] = \sum_{x=1}^{n} e^{tx} p_X(x) = \frac{1}{n} \sum_{x=1}^{n} (e^t)^x

The last sum is the sum of a geometric sequence with ratio e^t, so

      M_X(t) = \frac{1}{n} \cdot \frac{e^t - e^{(n+1)t}}{1 - e^t} = \frac{e^t - e^{(n+1)t}}{n(1 - e^t)}

The above equality is true for all t ≠ 0. If t = 0, we have:

      M_X(0) = \frac{1}{n} \sum_{x=1}^{n} e^{0 \cdot x} = 1

In a nutshell, the mgf of X is given by:

      M_X(t) = (e^t - e^{(n+1)t}) / (n(1 - e^t))   if t ≠ 0
      M_X(t) = 1                                   if t = 0

3.2 Binomial Distribution

Definition 3.2 (Bernoulli process). A Bernoulli or binomial process has the following features:

1. We repeat n ∈ ℕ identical trials.
2. A trial can result in only two possible outcomes: a certain event E, called success, occurs with probability p, and the event E^c, called failure, occurs with probability 1 − p.
3. The probability of success p remains constant trial after trial. In this case, the process is said to be stationary.
4. The trials are mutually independent.
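The four features above can be illustrated by a short simulation (an illustrative sketch, not part of the notes; names and parameter values are hypothetical):

```python
import random

def bernoulli_process(n, p, rng):
    """Simulate n identical, mutually independent trials; each trial is a
    success (True) with the same constant probability p."""
    return [rng.random() < p for _ in range(n)]

rng = random.Random(395)  # fixed seed so the run is reproducible
trials = bernoulli_process(10_000, 0.3, rng)
frac = sum(trials) / len(trials)
# By the law of large numbers, the observed success fraction is close to p.
assert abs(frac - 0.3) < 0.05
```

Counting the successes in such a run is exactly the random variable that the next definition names binomial.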

Definition 3.3 (Binomial Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, which occurs with probability p. If n ∈ ℕ trials are performed according to a Bernoulli process, then the random variable X defined as the number of successes among the n trials is said to have a binomial distribution Bin(n, p), and its probability mass function is given by:

      p_X(x) = P(X = x) = \binom{n}{x} p^x (1 - p)^{n-x}, for x = 0, ..., n    (12)

Computation of the mgf. Let X be a random variable that follows a binomial distribution Bin(n, p). The mgf of X is given by:

      M_X(t) = \sum_{x=0}^{n} e^{tx} p_X(x)
             = \sum_{x=0}^{n} \binom{n}{x} e^{tx} p^x (1 - p)^{n-x}
             = \sum_{x=0}^{n} \binom{n}{x} (p e^t)^x (1 - p)^{n-x}
             = (p e^t + 1 - p)^n

where the last equality comes from the binomial theorem.

3.3 Poisson Distribution

Definition 3.4 (Poisson Distribution). Let (Ω, A, P) be a probability space. A random variable X is said to have a Poisson distribution P(λ), with λ > 0, if its probability mass function is given by:

      p_X(x) = P(X = x) = e^{-λ} \frac{λ^x}{x!}, for x ∈ ℕ    (13)

Computation of the mgf. Let X be a random variable that follows a Poisson distribution P(λ). The mgf of X is given by:

      M_X(t) = \sum_{x=0}^{\infty} e^{tx} e^{-λ} \frac{λ^x}{x!}
             = e^{-λ} \sum_{x=0}^{\infty} \frac{(λ e^t)^x}{x!}
             = e^{-λ} \exp(λ e^t)
             = \exp(λ(e^t - 1))
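Both closed forms can be sanity-checked against the defining sum (6). A sketch (the parameter values are arbitrary, and truncating the Poisson sum at 60 terms is an assumption, justified because the tail is negligible at these parameters):

```python
import math

def mgf_from_pmf(pmf, t):
    """M_X(t) = sum over x of e^{tx} * p_X(x), the definition of the mgf."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Binomial Bin(n, p): compare the sum with the closed form (p e^t + 1 - p)^n.
n, p, t = 12, 0.4, 0.7
binom_pmf = {x: math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)}
closed_binom = (p * math.exp(t) + 1 - p) ** n
assert abs(mgf_from_pmf(binom_pmf, t) - closed_binom) < 1e-9

# Poisson P(lam): truncate the infinite sum; compare with exp(lam (e^t - 1)).
lam = 2.5
pois_pmf = {x: math.exp(-lam) * lam**x / math.factorial(x) for x in range(60)}
closed_pois = math.exp(lam * (math.exp(t) - 1))
assert abs(mgf_from_pmf(pois_pmf, t) - closed_pois) < 1e-9
```

The binomial check is exact up to floating-point error, since its sum is finite; the Poisson check relies on the truncated tail being far below the tolerance.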

3.4 Geometric Distribution

Definition 3.5 (Geometric Distribution). Let (Ω, A, P) be a probability space. Let E ∈ A be an event labeled as success, which occurs with probability p. If all the assumptions of a Bernoulli process are satisfied, except that the number of trials is not preset, then the random variable X defined as the number of trials until the first success is said to have a geometric distribution G(p), and its probability mass function is given by:

      p_X(x) = P(X = x) = (1 - p)^{x-1} p, for x = 1, 2, ...    (14)

Computation of the mgf. Left as an exercise (Homework 3)!

Summary of common discrete distributions:

Distribution          pmf p_X(x) = P(X = x)                M_X(t)                                                    E[X]       Var(X)
Uniform U_n,          1/n, for x = 1, ..., n               (e^t - e^{(n+1)t}) / (n(1 - e^t)) if t ≠ 0; 1 if t = 0    (n+1)/2    (n^2 - 1)/12
  n ∈ ℕ
Binomial Bin(n, p),   \binom{n}{x} p^x (1-p)^{n-x},        (p e^t + 1 - p)^n                                         np         np(1 - p)
  n ∈ ℕ, p ∈ (0, 1)     for x = 0, ..., n
Poisson P(λ),         e^{-λ} λ^x / x!, for x ∈ ℕ           exp(λ(e^t - 1))                                           λ          λ
  λ > 0
Geometric G(p),       (1 - p)^{x-1} p, for x = 1, 2, ...   p e^t / (1 - (1-p) e^t), for t < -ln(1 - p)               1/p        (1 - p)/p^2
  p ∈ (0, 1)
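The E[X] and Var(X) columns are consistent with the earlier lemma (E[X] = M_X'(0) and Var(X) = M_X''(0) − (M_X'(0))^2). A numerical sketch using central finite differences (the helper names and step sizes h are illustrative, not from the notes), applied to the tabulated geometric mgf:

```python
import math

def num_deriv(f, t0=0.0, h=1e-5):
    """Central-difference first derivative: f'(t0) ~ (f(t0+h) - f(t0-h)) / 2h."""
    return (f(t0 + h) - f(t0 - h)) / (2 * h)

def num_deriv2(f, t0=0.0, h=1e-4):
    """Central-difference second derivative."""
    return (f(t0 + h) - 2 * f(t0) + f(t0 - h)) / h**2

# Geometric G(p): the table claims M_X(t) = p e^t / (1 - (1-p) e^t),
# with E[X] = 1/p and Var(X) = (1-p)/p^2.
p = 0.3
mgf = lambda t: p * math.exp(t) / (1 - (1 - p) * math.exp(t))
mean = num_deriv(mgf)              # M'(0) approximates E[X]
var = num_deriv2(mgf) - mean**2    # M''(0) - M'(0)^2 approximates Var(X)
assert abs(mean - 1 / p) < 1e-4
assert abs(var - (1 - p) / p**2) < 1e-2
```

The evaluation points stay within t < −ln(1 − p) ≈ 0.357, so the tabulated mgf formula is valid where it is differentiated.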

4 Moment generating functions of common continuous distributions

4.1 Continuous Uniform Distribution

Definition 4.1. A continuous rrv is said to follow a uniform distribution U(a, b) on a segment [a, b], with a < b, if its pdf is

      f_X(x) = \frac{1}{b - a} 1_{[a,b]}(x)    (15)

Computation of the mgf. Let X be a continuous random variable that follows a uniform distribution U(a, b). For t ≠ 0, the mgf of X is given by:

      M_X(t) = \int_{-\infty}^{+\infty} e^{tx} f_X(x) \, dx
             = \frac{1}{b - a} \int_{a}^{b} e^{tx} \, dx
             = \frac{1}{b - a} \left[ \frac{e^{tx}}{t} \right]_{x=a}^{x=b}
             = \frac{e^{tb} - e^{ta}}{t(b - a)}

The above equality holds for t ≠ 0. We notice that M_X(0) = 1.

4.2 Normal Distribution

Definition 4.2. A continuous random variable is said to follow a normal (or Gaussian) distribution N(µ, σ^2) with parameters, mean µ and variance σ^2, if its pdf f_X is given by:

      f_X(x) = \frac{1}{σ \sqrt{2π}} \exp\left\{ -\frac{1}{2} \left( \frac{x - µ}{σ} \right)^2 \right\}, for x ∈ ℝ    (16)

Computation of the mgf. We start by computing the mgf of a standard normal random variable Z ~ N(0, 1). The mgf of Z is given by:

      M_Z(t) = \int_{-\infty}^{+\infty} e^{tx} f_Z(x) \, dx
             = \int_{-\infty}^{+\infty} e^{tx} \frac{1}{\sqrt{2π}} e^{-x^2/2} \, dx
             = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2π}} e^{-(x^2 - 2xt)/2} \, dx
             = e^{t^2/2} \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2π}} e^{-(x - t)^2/2} \, dx
             = e^{t^2/2}

where the last integrand is the pdf of N(t, 1), which integrates to 1.

We know (Property 4.5 in Chapter 5) that X = µ + σZ follows a normal distribution N(µ, σ^2). Therefore the mgf of a normal distribution N(µ, σ^2) is given by:

      M_{µ+σZ}(t) = E[e^{t(µ + σZ)}]
                  = e^{tµ} E[e^{tσZ}]
                  = e^{tµ} M_Z(tσ)
                  = e^{tµ} e^{t^2 σ^2 / 2}
                  = \exp\left\{ µt + \frac{σ^2 t^2}{2} \right\}

4.3 Exponential Distribution

Definition 4.3. A continuous random variable is said to follow an exponential distribution E(λ) with λ > 0 if its pdf f_X is given by:

      f_X(x) = λ e^{-λx} 1_{ℝ_+}(x)    (17)

Computation of the mgf. Let X be a continuous random variable that follows an exponential distribution E(λ). The mgf of X is given by:

      M_X(t) = \int_{-\infty}^{+\infty} e^{tx} f_X(x) \, dx
             = \int_{0}^{+\infty} e^{tx} λ e^{-λx} \, dx
             = λ \int_{0}^{+\infty} e^{-x(λ - t)} \, dx
             = λ \left[ \frac{-e^{-x(λ - t)}}{λ - t} \right]_{x=0}^{+\infty}
             = \frac{-λ}{λ - t} [0 - 1]
             = \frac{λ}{λ - t}

When integrating the exponential, we must be aware that lim_{x→∞} e^{-x(λ - t)} = 0 if and only if λ − t > 0. Therefore the derived formula holds if and only if t < λ.

Summary of common continuous distributions:

Distribution               pdf f_X(x)                                   M_X(t)                                       E[X]        Var(X)
Uniform U(a, b),           1/(b - a) · 1_{[a,b]}(x)                     (e^{tb} - e^{ta}) / (t(b - a)) if t ≠ 0;     (a + b)/2   (b - a)^2/12
  a, b ∈ ℝ, a < b                                                       1 if t = 0
Normal N(µ, σ^2),          (1/(σ√(2π))) exp{-(1/2)((x - µ)/σ)^2},       exp{µt + σ^2 t^2 / 2}                        µ           σ^2
  µ ∈ ℝ, σ > 0               for x ∈ ℝ
Exponential E(λ), λ > 0    λ e^{-λx} 1_{ℝ_+}(x)                         λ/(λ - t), for t < λ                         1/λ         1/λ^2
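The continuous mgf formulas can also be checked by Monte Carlo, estimating E[e^{tX}] from samples (an illustrative sketch, not part of the notes; the parameter values and sample size are arbitrary):

```python
import math
import random

rng = random.Random(2017)  # fixed seed for reproducibility
n = 200_000
lam, t = 2.0, 0.5  # need t < lam for the exponential mgf to exist

# X ~ E(lam): compare the sample mean of e^{tX} with lam / (lam - t).
samples = [rng.expovariate(lam) for _ in range(n)]
mc = sum(math.exp(t * x) for x in samples) / n
assert abs(mc - lam / (lam - t)) < 0.02

# X ~ N(mu, sigma^2): compare with exp(mu t + sigma^2 t^2 / 2).
mu, sigma = 1.0, 0.8
samples = [rng.gauss(mu, sigma) for _ in range(n)]
mc_norm = sum(math.exp(t * x) for x in samples) / n
assert abs(mc_norm - math.exp(mu * t + sigma**2 * t**2 / 2)) < 0.02
```

At this sample size the standard error of each estimate is roughly 0.001-0.002, so the 0.02 tolerance leaves a wide margin.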