Moment Generating Functions
MATH 382 Moment Generating Functions Dr. Neal, WKU

Definition. Let $X$ be a random variable. The moment generating function (mgf) of $X$ is the function $M_X : \mathbb{R} \to \mathbb{R}$ given by $M_X(t) = E[e^{Xt}]$, defined for all $t$ for which this expected value is finite. When $X$ is a discrete random variable, $M_X(t)$ is obtained by
$$M_X(t) = \sum_k e^{kt}\, P(X = k),$$
and when $X$ is a continuous random variable, $M_X(t)$ is obtained by
$$M_X(t) = \int_{-\infty}^{\infty} e^{xt} f_X(x)\,dx.$$

We shall derive the mgfs of our standard random variables, derive properties of the mgf, and then show how to use mgfs to derive the distribution of various sums of independent random variables.

The Binomial MGF

Let $X \sim b(n, p)$. By the binomial expansion theorem, $\sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k} = (a + b)^n$; thus,
$$M_X(t) = \sum_{k=0}^{n} e^{kt}\, P(X = k) = \sum_{k=0}^{n} e^{kt} \binom{n}{k} p^k q^{n-k} = \sum_{k=0}^{n} \binom{n}{k} (pe^t)^k q^{n-k} = (pe^t + q)^n, \quad \text{for all } t.$$

A special case is the Bernoulli random variable $X \sim b(1, p)$, for which $M_X(t) = pe^t + q$.

The Geometric MGF

Let $X \sim \mathrm{geo}(p)$ with $p > 0$. Then $0 \le q < 1$, and because $\sum_{k=1}^{\infty} x^k = x/(1 - x)$ for $-1 < x < 1$, we have
$$M_X(t) = \sum_{k=1}^{\infty} e^{kt}\, P(X = k) = \sum_{k=1}^{\infty} e^{kt} q^{k-1} p = \frac{p}{q} \sum_{k=1}^{\infty} (qe^t)^k = \frac{p}{q} \cdot \frac{qe^t}{1 - qe^t} = \frac{pe^t}{1 - qe^t},$$
for $qe^t < 1$; i.e., for $t < \ln(1/q)$.
The Poisson MGF

Let $X \sim \mathrm{Poi}(\lambda)$. Because $\sum_{k=0}^{\infty} x^k/k! = e^x$ for all $x$, we have
$$M_X(t) = \sum_{k=0}^{\infty} e^{kt}\, P(X = k) = \sum_{k=0}^{\infty} e^{kt}\, \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}, \quad \text{for all } t.$$

The Uniform MGF

Let $X \sim U[a, b]$. Then
$$M_X(t) = \int_a^b e^{xt} f_X(x)\,dx = \int_a^b \frac{e^{xt}}{b - a}\,dx = \left. \frac{e^{xt}}{t(b - a)} \right|_a^b = \frac{e^{bt} - e^{at}}{t(b - a)}, \quad \text{for } t \neq 0.$$

The Exponential MGF

Let $X \sim \exp(\theta)$ with $\theta > 0$. We shall assume that $\theta t - 1 < 0$ so that the improper integral below will converge. We then have
$$M_X(t) = \int_0^{\infty} e^{xt} f_X(x)\,dx = \int_0^{\infty} e^{xt}\, \frac{1}{\theta} e^{-x/\theta}\,dx = \frac{1}{\theta} \int_0^{\infty} e^{x(\theta t - 1)/\theta}\,dx = \frac{1}{\theta} \left. \frac{\theta}{\theta t - 1}\, e^{x(\theta t - 1)/\theta} \right|_0^{\infty} = \frac{1}{1 - \theta t}, \quad \text{for } t < 1/\theta.$$

The Normal MGF

Let $X \sim N(\mu, \sigma^2)$. Then $f_X(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x - \mu)^2/(2\sigma^2)}$. Now let $Y \sim N(\mu + \sigma^2 t, \sigma^2)$ with
$$f_Y(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x - (\mu + \sigma^2 t))^2/(2\sigma^2)} = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x(\mu + \sigma^2 t) + (\mu + \sigma^2 t)^2)/(2\sigma^2)}$$
$$= \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x(\mu + \sigma^2 t) + \mu^2 + 2\mu\sigma^2 t + \sigma^4 t^2)/(2\sigma^2)} = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x(\mu + \sigma^2 t) + \mu^2)/(2\sigma^2)}\, e^{-\mu t}\, e^{-\sigma^2 t^2/2}.$$
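The continuous mgfs above can likewise be checked by approximating the defining integral of $e^{xt} f_X(x)$ numerically. This is a rough sketch, not part of the notes; the parameter values, integration ranges, and the simple midpoint Riemann sum are all illustrative choices:

```python
import math

def mgf_numeric(pdf, t, lo, hi, steps=100_000):
    """Approximate the integral of e^(x*t) * pdf(x) over [lo, hi]
    with a midpoint Riemann sum."""
    h = (hi - lo) / steps
    return h * sum(math.exp((lo + (i + 0.5) * h) * t) * pdf(lo + (i + 0.5) * h)
                   for i in range(steps))

# Exponential: X ~ exp(theta), closed form 1/(1 - theta*t) for t < 1/theta.
theta, t = 2.0, 0.25
exp_pdf = lambda x: math.exp(-x / theta) / theta
print(mgf_numeric(exp_pdf, t, 0.0, 80.0), 1 / (1 - theta * t))

# Normal: X ~ N(mu, sigma^2), closed form e^(mu*t + sigma^2*t^2/2).
mu, sigma = 1.0, 1.5
norm_pdf = lambda x: (math.exp(-(x - mu) ** 2 / (2 * sigma ** 2))
                      / (sigma * math.sqrt(2 * math.pi)))
print(mgf_numeric(norm_pdf, t, mu - 40, mu + 40),
      math.exp(mu * t + sigma ** 2 * t ** 2 / 2))
```

The upper limits truncate the improper integrals where the integrands are already negligible, so each numeric value lands very close to its closed form.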
Then because $\int_{-\infty}^{\infty} f_Y(x)\,dx = 1$, we have
$$M_X(t) = \int_{-\infty}^{\infty} e^{xt} f_X(x)\,dx = \int_{-\infty}^{\infty} e^{xt}\, \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x - \mu)^2/(2\sigma^2)}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x\mu + \mu^2)/(2\sigma^2) + xt}\,dx$$
$$= \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x\mu - 2\sigma^2 xt + \mu^2)/(2\sigma^2)}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x(\mu + \sigma^2 t) + \mu^2)/(2\sigma^2)}\,dx$$
$$= e^{\mu t} e^{\sigma^2 t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x^2 - 2x(\mu + \sigma^2 t) + \mu^2)/(2\sigma^2)}\, e^{-\mu t}\, e^{-\sigma^2 t^2/2}\,dx = e^{\mu t} e^{\sigma^2 t^2/2} \int_{-\infty}^{\infty} f_Y(x)\,dx = e^{\mu t} e^{\sigma^2 t^2/2}.$$

Thus, $M_X(t) = e^{\mu t} e^{\sigma^2 t^2/2}$ for all $t$. A special case is the standard normal distribution $Z \sim N(0, 1)$, for which $M_Z(t) = e^{t^2/2}$ for all $t$.

Generating Moments

Let $X$ be a random variable. The first moment of $X$ is simply the expected value $E[X]$. In general, for $j \ge 1$, the $j$th moment of $X$ is $E[X^j]$, which can be found by
$$E[X^j] = \sum_k k^j\, P(X = k) \quad \text{for } X \text{ discrete}, \qquad E[X^j] = \int_{-\infty}^{\infty} x^j f_X(x)\,dx \quad \text{for } X \text{ continuous}.$$

But the mgf of $X$ also can be used to obtain these moments. When $X$ is discrete, $M_X(t) = \sum_k e^{kt}\, P(X = k)$, and then the $j$th derivative of $M_X(t)$ with respect to $t$ is
$$M_X^{(j)}(t) = \sum_k k^j e^{kt}\, P(X = k).$$
Then $M_X^{(j)}(0) = \sum_k k^j\, P(X = k) = E[X^j]$.
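Because $E[X^j] = M_X^{(j)}(0)$, the moments can also be recovered numerically from any mgf by finite differences at $t = 0$. A small sketch using the binomial mgf from earlier (the values of n, p, and the step size h are arbitrary choices, not from the notes):

```python
import math

n, p = 10, 0.3

def M(t):
    """MGF of X ~ b(n, p): (p e^t + q)^n."""
    return (p * math.exp(t) + (1 - p)) ** n

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)             # central difference for M'(0) = E[X]
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # second difference for M''(0) = E[X^2]

print(first, n * p)                          # E[X] = np = 3
print(second - first ** 2, n * p * (1 - p))  # Var(X) = npq = 2.1
```

The finite-difference estimates match the exact values $E[X] = np$ and $\mathrm{Var}(X) = npq$ derived in Example 1 below, up to the truncation and rounding error of the difference quotients.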
Similarly, for $X$ continuous, $M_X(t) = \int_{-\infty}^{\infty} e^{xt} f_X(x)\,dx$. Then the $j$th derivative of $M_X(t)$ with respect to $t$ is
$$M_X^{(j)}(t) = \int_{-\infty}^{\infty} x^j e^{xt} f_X(x)\,dx, \quad \text{and thus} \quad M_X^{(j)}(0) = \int_{-\infty}^{\infty} x^j f_X(x)\,dx = E[X^j].$$

Thus, the $j$th moment of $X$ can be found by computing the $j$th derivative of $M_X(t)$ and then evaluating it at $t = 0$. In particular,
$$E[X] = M_X'(0), \qquad E[X^2] = M_X''(0), \qquad \mathrm{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2.$$

Example 1. Let $X \sim b(n, p)$ with $M_X(t) = (pe^t + q)^n$ for all $t$. Then the first and second derivatives of $M_X(t)$ are
$$M_X'(t) = n(pe^t + q)^{n-1} pe^t \quad \text{and} \quad M_X''(t) = n(n - 1)(pe^t + q)^{n-2}(pe^t)^2 + n(pe^t + q)^{n-1} pe^t.$$
Thus, $E[X] = M_X'(0) = n(p + q)^{n-1} p = np$ and $E[X^2] = M_X''(0) = n(n - 1)p^2 + np$. Thus,
$$\mathrm{Var}(X) = n(n - 1)p^2 + np - (np)^2 = np - np^2 = np(1 - p) = npq.$$

Properties of MGFs

(i) $M_{X+Y}(t) = M_X(t)\, M_Y(t)$ for independent random variables $X$ and $Y$.
(ii) $M_{aX}(t) = M_X(at)$ for all constants $a$.
(iii) $M_{X+c}(t) = e^{ct} M_X(t)$ for all constants $c$.

Proof. (i) Let $X$ and $Y$ be independent random variables. Then
$$M_{X+Y}(t) = E[e^{(X+Y)t}] = E[e^{Xt + Yt}] = E[e^{Xt} e^{Yt}] \overset{\text{indep.}}{=} E[e^{Xt}]\, E[e^{Yt}] = M_X(t)\, M_Y(t).$$
(ii) For all constants $a$, we have $M_{aX}(t) = E[e^{(aX)t}] = E[e^{X(at)}] = M_X(at)$.
(iii) For all constants $c$, we have $M_{X+c}(t) = E[e^{(X+c)t}] = E[e^{Xt} e^{ct}] = e^{ct} E[e^{Xt}] = e^{ct} M_X(t)$.
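Properties (ii) and (iii) can be illustrated with the normal mgf, since $aX \sim N(a\mu, a^2\sigma^2)$ and $X + c \sim N(\mu + c, \sigma^2)$ have mgfs of the same closed form. A sketch with arbitrary illustrative constants (not from the notes):

```python
import math

def normal_mgf(mu, var, t):
    """M(t) = e^(mu*t + var*t^2/2) for a N(mu, var) random variable."""
    return math.exp(mu * t + var * t * t / 2)

mu, var, t = 1.0, 4.0, 0.7
a, c = 3.0, 2.5

# Property (ii): aX ~ N(a*mu, a^2*var), and its mgf equals M_X(a t).
lhs = normal_mgf(a * mu, a * a * var, t)
rhs = normal_mgf(mu, var, a * t)
print(lhs, rhs)

# Property (iii): X + c ~ N(mu + c, var), and its mgf equals e^(c t) * M_X(t).
lhs2 = normal_mgf(mu + c, var, t)
rhs2 = math.exp(c * t) * normal_mgf(mu, var, t)
print(lhs2, rhs2)
```

Each pair agrees up to floating-point rounding, exactly as the algebraic identities require.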
The following result, stated without proof, is perhaps the most important property of mgfs, and can be used to determine the distribution of independent sums of known distributions.

Theorem. The moment generating function completely determines the distribution of a random variable. That is, if two random variables have the same mgf, then they have the same distribution.

Example 2. Let $X \sim N(\mu, \sigma^2)$ and let $Y = \dfrac{X - \mu}{\sigma}$. We have seen previously that $Y$ has a $N(0, 1)$ distribution. We can prove this fact by using the properties of mgfs and the preceding theorem. The mgf of $X$ is $M_X(t) = e^{\mu t} e^{\sigma^2 t^2/2}$. Thus the mgf of $Y = \dfrac{X - \mu}{\sigma}$ is given by
$$M_Y(t) = M_{\frac{X - \mu}{\sigma}}(t) = e^{(-\mu/\sigma)t}\, M_{X/\sigma}(t) = e^{(-\mu/\sigma)t}\, M_X(t/\sigma) = e^{(-\mu/\sigma)t}\, e^{\mu t/\sigma}\, e^{\sigma^2(t^2/\sigma^2)/2} = e^{t^2/2},$$
which is the mgf of the $N(0, 1)$ distribution. Thus, $Y$ itself must be $N(0, 1)$ because it has the same mgf as $N(0, 1)$.

Example 3. (Sum of Independent Binomial Distributions) Let $X \sim b(n, p)$ and $Y \sim b(m, p)$, with $X$ and $Y$ being independent. Derive the distribution of $X + Y$.

Solution. We know $M_X(t) = (pe^t + q)^n$ and $M_Y(t) = (pe^t + q)^m$ for all $t$. Then by independence, the mgf of $X + Y$ is
$$M_{X+Y}(t) = M_X(t)\, M_Y(t) = (pe^t + q)^n (pe^t + q)^m = (pe^t + q)^{n+m},$$
which is the mgf of a $b(n + m, p)$ distribution. Thus, $X + Y \sim b(n + m, p)$ because it has the same mgf as $b(n + m, p)$.

Note: For both $X$ and $Y$, the probability of success on any attempt is $p$. $X$ counts the number of successes in $n$ independent attempts, and $Y$ counts the number of successes in $m$ further independent attempts. So $X + Y$ counts the total number of successes in $n + m$ independent attempts; hence, $X + Y \sim b(n + m, p)$.
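The sum-of-binomials result can be spot-checked by simulation: add draws from $b(n, p)$ and $b(m, p)$ and compare the empirical distribution of the sum with the $b(n+m, p)$ pmf. A rough Monte Carlo sketch, not part of the notes, with illustrative parameters and a fixed seed:

```python
import math
import random

random.seed(1)
n, m, p, N = 4, 6, 0.35, 100_000

def binom_draw(k, p):
    """One draw from b(k, p) as a sum of k Bernoulli trials."""
    return sum(random.random() < p for _ in range(k))

# Tally the empirical distribution of X + Y.
counts = [0] * (n + m + 1)
for _ in range(N):
    counts[binom_draw(n, p) + binom_draw(m, p)] += 1

# Compare with the b(n + m, p) pmf.
for j in range(n + m + 1):
    theory = math.comb(n + m, j) * p**j * (1 - p)**(n + m - j)
    print(j, counts[j] / N, round(theory, 4))
```

With 100,000 trials the empirical frequencies track the $b(10, 0.35)$ probabilities to within roughly Monte Carlo error, about $1/\sqrt{N}$.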
Example 4. (Sum of Independent Normal Distributions) Let $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$, with $X$ and $Y$ being independent. Derive the distribution of $X + Y$.

Solution. We know $M_X(t) = e^{\mu_1 t} e^{\sigma_1^2 t^2/2}$ and $M_Y(t) = e^{\mu_2 t} e^{\sigma_2^2 t^2/2}$ for all $t$. Then by independence, the mgf of $X + Y$ is
$$M_{X+Y}(t) = M_X(t)\, M_Y(t) = e^{\mu_1 t} e^{\sigma_1^2 t^2/2}\, e^{\mu_2 t} e^{\sigma_2^2 t^2/2} = e^{(\mu_1 + \mu_2)t}\, e^{(\sigma_1^2 + \sigma_2^2) t^2/2},$$
which is the mgf of a normal distribution having mean $\mu = \mu_1 + \mu_2$ and variance $\sigma^2 = \sigma_1^2 + \sigma_2^2$. Because the mgf completely determines the distribution, we must have $X + Y \sim N(\mu, \sigma^2) = N(\mu_1 + \mu_2,\; \sigma_1^2 + \sigma_2^2)$.

Distribution of the Sample Mean of Independent Sample Measurements from a $N(\mu, \sigma^2)$ Distribution

Let $\bar{x} = \dfrac{x_1 + \cdots + x_n}{n}$ be the sample mean, where $x_i \sim N(\mu, \sigma^2)$ is the $i$th independent sample of a normally distributed measurement. Then each $x_i$ has the same mgf, given by $M_i(t) = e^{\mu t} e^{\sigma^2 t^2/2}$. By independence, the mgf of the sum $S = x_1 + \cdots + x_n$ is the product of the mgfs:
$$M_S(t) = \left(e^{\mu t} e^{\sigma^2 t^2/2}\right)^n = e^{n\mu t}\, e^{n\sigma^2 t^2/2}.$$
Then the mgf of $\bar{x}$ is
$$M_{\bar{x}}(t) = M_{S/n}(t) = M_S(t/n) = e^{n\mu t/n}\, e^{n\sigma^2 (t/n)^2/2} = e^{\mu t}\, e^{(\sigma^2/n)\, t^2/2},$$
which is the mgf of a normal distribution having mean $\mu$ and variance $\sigma^2/n$. Because the mgf completely determines the distribution, we must have $\bar{x} \sim N\!\left(\mu, \dfrac{\sigma^2}{n}\right)$.

Note: When sampling with a normally distributed measurement, the sample mean $\bar{x}$ is still normally distributed regardless of the sample size. And we have the following:

$\mu_{\bar{x}} = \mu$: The average of all possible sample means from samples of size $n$ equals the original average $\mu$.

$\sigma_{\bar{x}} = \dfrac{\sigma}{\sqrt{n}}$: The standard deviation of all possible sample means from samples of size $n$ is a fraction of the original standard deviation $\sigma$.

Important Statistical Consequence: As the sample size $n$ increases, $\sigma_{\bar{x}}$ decreases to 0. Thus with large sample sizes, the values of $\bar{x}$ have very small deviation, and therefore the possible values of $\bar{x}$ are consistently close to their average $\mu$.
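The conclusion that $\bar{x} \sim N(\mu, \sigma^2/n)$ can be seen empirically by simulating many samples of size $n$ and inspecting the mean and standard deviation of the resulting sample means. A quick sketch with illustrative values (not from the notes); note that `random.gauss` takes the standard deviation, not the variance:

```python
import math
import random
import statistics

random.seed(2)
mu, sigma, n, reps = 10.0, 3.0, 25, 20_000

# Draw many samples of size n and record each sample mean.
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

print(statistics.fmean(means))  # close to mu = 10
print(statistics.stdev(means))  # close to sigma / sqrt(n) = 0.6
```

The simulated spread of the sample means shrinks by the factor $1/\sqrt{n}$, matching $\sigma_{\bar{x}} = \sigma/\sqrt{n}$ above.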
Thus, with a large sample size, virtually any sample mean $\bar{x}$ will be a decent approximation of $\mu$.
Exercises

1. Use the moment generating function to derive the mean and variance of $X \sim \exp(\theta)$.

2. Let $X \sim \mathrm{Poi}(\lambda)$ and $Y \sim \mathrm{Poi}(\beta)$, with $X$ and $Y$ being independent. Derive the distribution of $X + Y$.

3. The negative binomial distribution $X \sim nb(r, p)$ is the sum of $r$ independent geometric distributions: $X = X_1 + X_2 + \cdots + X_r$, where each $X_i \sim \mathrm{geo}(p)$.
(a) Use the mgfs of the $X_i$ to derive the mgf of $X \sim nb(r, p)$.
(b) Use the mgf to derive the expected value of $X \sim nb(r, p)$.
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More informationLecture 1: August 28
36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random
More informationFundamental Tools - Probability Theory IV
Fundamental Tools - Probability Theory IV MSc Financial Mathematics The University of Warwick October 1, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory IV 1 / 14 Model-independent
More informationExample continued. Math 425 Intro to Probability Lecture 37. Example continued. Example
continued : Coin tossing Math 425 Intro to Probability Lecture 37 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan April 8, 2009 Consider a Bernoulli trials process with
More informationReview 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015
Review : STAT 36 Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics August 25, 25 Support of a Random Variable The support of a random variable, which is usually denoted
More informationi=1 k i=1 g i (Y )] = k
Math 483 EXAM 2 covers 2.4, 2.5, 2.7, 2.8, 3.1, 3.2, 3.3, 3.4, 3.8, 3.9, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.9, 5.1, 5.2, and 5.3. The exam is on Thursday, Oct. 13. You are allowed THREE SHEETS OF NOTES and
More informationMath 341: Probability Eighteenth Lecture (11/12/09)
Math 341: Probability Eighteenth Lecture (11/12/09) Steven J Miller Williams College Steven.J.Miller@williams.edu http://www.williams.edu/go/math/sjmiller/ public html/341/ Bronfman Science Center Williams
More informationSampling Distributions
Sampling Distributions Mathematics 47: Lecture 9 Dan Sloughter Furman University March 16, 2006 Dan Sloughter (Furman University) Sampling Distributions March 16, 2006 1 / 10 Definition We call the probability
More information