Lecture 5: Moment generating functions
Definition
The moment generating function (mgf) of a random variable X is

    M_X(t) = E(e^{tX}) = Σ_x e^{tx} f_X(x)       if X has a pmf f_X,
    M_X(t) = E(e^{tX}) = ∫ e^{tx} f_X(x) dx      if X has a pdf f_X,

provided that E(e^{tX}) exists. (Note that M_X(0) = E(e^{0·X}) = 1 always exists.) Otherwise, we say that the mgf M_X(t) does not exist at t.

Theorem
For any constants a and b, the mgf of the random variable aX + b is

    M_{aX+b}(t) = e^{bt} M_X(at).

Proof. By definition,

    M_{aX+b}(t) = E(e^{t(aX+b)}) = E(e^{taX} e^{bt}) = e^{bt} E(e^{(ta)X}) = e^{bt} M_X(at).

UW-Madison (Statistics)    Stat 609    Lecture 5, 1 / 16
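The scaling identity above can be sanity-checked numerically. The sketch below (my own illustration, not part of the slides) uses X ~ Exponential(1), whose mgf is M_X(t) = 1/(1 − t) for t < 1; the sample size and the values of a, b, t are arbitrary choices.

```python
import math
import random

# Monte Carlo check of the identity M_{aX+b}(t) = e^{bt} M_X(at),
# using X ~ Exponential(1), whose mgf is M_X(t) = 1/(1 - t) for t < 1.
random.seed(0)
xs = [random.expovariate(1.0) for _ in range(200_000)]

def mgf(sample, t):
    """Empirical mgf: average of e^{t x} over the sample."""
    return sum(math.exp(t * x) for x in sample) / len(sample)

a, b, t = 0.5, 2.0, 0.4                  # chosen so that a*t < 1 (mgf finite)
lhs = mgf([a * x + b for x in xs], t)    # M_{aX+b}(t), estimated directly
rhs = math.exp(b * t) * mgf(xs, a * t)   # e^{bt} M_X(at), from the theorem
exact = math.exp(b * t) / (1 - a * t)    # closed form for Exp(1)
print(lhs, rhs, exact)
```

The two empirical quantities agree to floating-point precision (they are algebraic rearrangements of the same sample average), and both approach the closed form as the sample grows.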
The main uses of the mgf:
- It can be used to generate moments.
- It helps to characterize a distribution.

Theorem
If M_X(t) exists at ±t, then E(X^n) exists for any positive integer n and

    E(X^n) = M_X^{(n)}(0) = d^n/dt^n M_X(t) |_{t=0},

i.e., the nth moment is the nth derivative of M_X(t) evaluated at t = 0.
Proof. Assuming that we can exchange the differentiation and integration (which will be justified later),

    d^n/dt^n M_X(t) = d^n/dt^n ∫ e^{tx} f_X(x) dx = ∫ d^n/dt^n e^{tx} f_X(x) dx = ∫ x^n e^{tx} f_X(x) dx.

Hence

    d^n/dt^n M_X(t) |_{t=0} = ∫ x^n e^{0·x} f_X(x) dx = ∫ x^n f_X(x) dx = E(X^n).
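The derivative-at-zero relation can be illustrated with finite differences (a sketch of my own, not from the lecture), again using the Exponential(1) mgf M(t) = 1/(1 − t), whose moments are E(X^n) = n!:

```python
import math

# Numerical check that E(X^n) = d^n/dt^n M_X(t) at t = 0, using the
# closed-form mgf of Exponential(1): M(t) = 1/(1 - t), so E(X^n) = n!.
def M(t):
    return 1.0 / (1.0 - t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)               # central difference ~ M'(0) = E(X)
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2     # second difference ~ M''(0) = E(X^2)
print(m1, m2)                               # close to 1! = 1 and 2! = 2
```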
The condition that M_X(t) exists at ±t is in fact used to ensure the validity of the exchange of differentiation and integration. Besides this, we also need to argue that E|X|^n < ∞ for any positive integer n under the condition that M_X(t) exists at ±t.
We first show that, if M_X(t) exists at ±t, then for any s ∈ (−t, t), M_X(s) exists:

    M_X(s) = E(e^{sX}) = E[e^{sX} I(X > 0)] + E[e^{sX} I(X ≤ 0)]
           ≤ E[e^{tX} I(X > 0)] + E[e^{−tX} I(X ≤ 0)]
           ≤ E(e^{tX}) + E(e^{−tX}) = M_X(t) + M_X(−t) < ∞,

where I(X > 0) is the indicator of X > 0.
Next, we show that, for any p > 0, E|X|^p < ∞ under the condition that M_X(t) exists at ±t. For a given p > 0, choose s such that 0 < ps < t. Because s|X| ≤ e^{s|X|}, we have

    s^p |X|^p ≤ e^{ps|X|} ≤ e^{psX} + e^{−psX}

and

    E|X|^p ≤ s^{−p} E(e^{psX} + e^{−psX}) = s^{−p} [M_X(ps) + M_X(−ps)] < ∞.
In fact, the condition that M_X(t) exists at ±t ensures that M_X(s) has the power series expansion

    M_X(s) = Σ_{k=0}^∞ E(X^k) s^k / k!,    −t < s < t.

If the distribution of X is symmetric (about 0), i.e., X and −X have the same distribution, then

    M_X(t) = E(e^{tX}) = E(e^{t(−X)}) = E(e^{−tX}) = M_X(−t),

i.e., M_X(t) is an even function, and "M_X(t) exists at ±t" is the same as "M_X(t) exists at a t > 0".

Example (Gamma mgf)
Let X have the gamma pdf

    f_X(x) = 1/(Γ(α)β^α) x^{α−1} e^{−x/β},    x > 0,

where α > 0 and β > 0 are two constants and

    Γ(α) = ∫_0^∞ x^{α−1} e^{−x} dx,    α > 0,

is the so-called gamma function.
If t < 1/β,

    M_X(t) = E(e^{tX}) = 1/(Γ(α)β^α) ∫_0^∞ e^{tx} x^{α−1} e^{−x/β} dx
           = 1/(Γ(α)β^α) ∫_0^∞ x^{α−1} e^{−x/(β/(1−βt))} dx
           = 1/(Γ(α)β^α) (β/(1−βt))^α ∫_0^∞ s^{α−1} e^{−s} ds     (s = x(1−βt)/β)
           = 1/(Γ(α)β^α) (β/(1−βt))^α Γ(α)
           = 1/(1−βt)^α.

If t ≥ 1/β, then E(e^{tX}) = ∞.
We can obtain

    E(X) = d/dt M_X(t) |_{t=0} = αβ/(1−βt)^{α+1} |_{t=0} = αβ.

For any integer n > 1,

    E(X^n) = d^n/dt^n M_X(t) |_{t=0} = α(α+1)···(α+n−1)β^n/(1−βt)^{α+n} |_{t=0} = α(α+1)···(α+n−1)β^n.
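Both the gamma mgf and the moment formula can be spot-checked by simulation. The sketch below (my own, not from the slides) uses Python's standard-library `random.gammavariate`, whose (shape, scale) parameterization matches (α, β) above; the specific α, β, t, n values are arbitrary.

```python
import math
import random

# Monte Carlo check of the Gamma(α, β) mgf 1/(1 - βt)^α and of the
# moment formula E(X^n) = α(α+1)···(α+n-1) β^n.
random.seed(1)
alpha, beta = 3.0, 0.5
xs = [random.gammavariate(alpha, beta) for _ in range(200_000)]

t = 0.8                                   # any t < 1/β = 2 works
emp_mgf = sum(math.exp(t * x) for x in xs) / len(xs)
exact_mgf = (1 - beta * t) ** (-alpha)

n = 2
emp_mom = sum(x ** n for x in xs) / len(xs)
exact_mom = alpha * (alpha + 1) * beta ** n   # α(α+1)β² for n = 2
print(emp_mgf, exact_mgf, emp_mom, exact_mom)
```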
Can the moments determine a distribution? Can two random variables with different distributions have the same moments of every order?

Example
X_1 has pdf f_1(x) = 1/(√(2π) x) e^{−(log x)²/2},    x > 0;
X_2 has pdf f_2(x) = f_1(x)[1 + sin(2π log x)],    x > 0.
For any positive integer n,

    E(X_1^n) = 1/√(2π) ∫_0^∞ x^{n−1} e^{−(log x)²/2} dx
             = 1/√(2π) ∫_{−∞}^∞ e^{ny} e^{−y²/2} dy          (y = log x)
             = e^{n²/2}/√(2π) ∫_{−∞}^∞ e^{−(y−n)²/2} dy
             = e^{n²/2},

using the property of a normal distribution.
    E(X_2^n) = ∫_0^∞ x^n f_1(x)[1 + sin(2π log x)] dx
             = E(X_1^n) + 1/√(2π) ∫_0^∞ x^{n−1} e^{−(log x)²/2} sin(2π log x) dx
             = E(X_1^n) + 1/√(2π) ∫_{−∞}^∞ e^{ny} e^{−y²/2} sin(2πy) dy
             = E(X_1^n) + e^{n²/2}/√(2π) ∫_{−∞}^∞ e^{−(y−n)²/2} sin(2πy) dy
             = E(X_1^n) + e^{n²/2}/√(2π) ∫_{−∞}^∞ e^{−s²/2} sin(2π(s+n)) ds     (s = y − n)
             = E(X_1^n) + e^{n²/2}/√(2π) ∫_{−∞}^∞ e^{−s²/2} sin(2πs) ds
             = E(X_1^n),

since sin(2π(s+n)) = sin(2πs) for integer n and e^{−s²/2} sin(2πs) is an odd function.
This shows that X_1 and X_2 have the same moments of order n = 1, 2, ..., but they have different distributions.
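This classical counterexample can be verified numerically. The sketch below (my own illustration, assuming only the standard library; the grid and truncation bounds are arbitrary choices) computes the nth moment of both densities after the substitution y = log x, where the integrand becomes e^{ny} φ(y) [1 + sin(2πy)] with φ the standard normal pdf:

```python
import math

# Numerical check that f_1 and f_2 share all moments: after y = log x,
# the n-th moment of f_2 is ∫ e^{ny} φ(y) [1 + sin(2πy)] dy, and the
# sin term integrates to 0.  Plain trapezoid rule over [-12, 12].
def phi(y):
    return math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

def moment(n, perturbed):
    lo, hi, m = -12.0, 12.0, 240_000
    h = (hi - lo) / m
    total = 0.0
    for k in range(m + 1):
        y = lo + k * h
        g = math.exp(n * y) * phi(y)
        if perturbed:
            g *= 1.0 + math.sin(2 * math.pi * y)   # the f_2 perturbation
        total += g / 2 if k in (0, m) else g       # trapezoid end weights
    return total * h

# (plain moment, perturbed moment, exact value e^{n²/2}) for n = 1, 2, 3
results = {n: (moment(n, False), moment(n, True), math.exp(n * n / 2))
           for n in (1, 2, 3)}
for n, triple in results.items():
    print(n, triple)
```

All three numbers agree for each n, matching the derivation above.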
In some cases, moments do determine the distribution. The mgf, if it exists, determines a distribution.

Theorem
Let X and Y be random variables with cdfs F_X and F_Y, respectively.
a. If X and Y are bounded, then F_X(u) = F_Y(u) for all u iff E(X^r) = E(Y^r) for all r = 1, 2, ....
b. If the mgf's exist in a neighborhood of 0 and M_X(t) = M_Y(t) for all t in that neighborhood, then F_X(u) = F_Y(u) for all u.

The key idea of the proof can be explained as follows. Note that

    M_X(t) = ∫ e^{tx} f_X(x) dx

is (up to the sign of t) the Laplace transform of f_X(x). From the uniqueness of the Laplace transform, there is a one-to-one correspondence between the mgf and the pdf. We will give a proof of this result in Chapter 4 for the multivariate case, after we introduce the characteristic functions.
From the power series result in the last lecture, if the mgf of X exists in a neighborhood of 0, then it has a power series expansion which is determined by the moments E(X^n), n = 1, 2, .... Therefore, knowing the mgf and knowing the moments of all orders are the same, but this is under the condition that the mgf exists in a neighborhood of 0.
Once we establish part (b), the proof of part (a) is easy: if X and Y are bounded, then their mgf's exist for all t, and thus their cdf's are the same iff their moments of every order are the same.
The condition that the mgf exists in a neighborhood of 0 is important. There are random variables with finite moments of every order whose mgf's do not exist.

Example
The pdf

    f_X(x) = 1/(√(2π) x) e^{−(log x)²/2},    x > 0,

is called the log-normal distribution or density, because if X has pdf f_X, then log X has a normal pdf.
In Example 2.3.1, we have shown that the log-normal distribution has finite moments of every order. For t > 0,

    M_X(t) = ∫_0^∞ e^{tx}/(√(2π) x) e^{−(log x)²/2} dx = ∞,

because, when t > 0,

    lim_{x→∞} e^{tx}/(√(2π) x) e^{−(log x)²/2} = ∞.

When t < 0,

    M_X(t) = ∫_0^∞ e^{tx}/(√(2π) x) e^{−(log x)²/2} dx ≤ ∫_0^∞ 1/(√(2π) x) e^{−(log x)²/2} dx = 1

and, hence, M_X(t) exists for all t < 0.
How do we find a distribution for which all moments exist and the mgf does not exist for any t ≠ 0? Consider the pdf

    f_Y(x) = { f_X(x)/2,     x > 0
             { f_X(−x)/2,    x < 0.
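The divergence of the log-normal mgf for t > 0 can be seen numerically: the truncated integral ∫_0^M e^{tx} f_X(x) dx keeps growing with M instead of stabilizing. The sketch below (my own, not from the slides; the grid, bounds, and t = 0.1 are arbitrary choices) integrates in y = log x, where the integrand is e^{t e^y} φ(y):

```python
import math

# For t > 0 the truncated integral of e^{tx} f_X(x) over (0, e^{y_max})
# grows without bound, consistent with M_X(t) = ∞ for the log-normal.
# In y = log x the integrand is exp(t e^y - y^2/2) / sqrt(2π).
def truncated_mgf(t, y_max, m=100_000):
    lo = -10.0
    h = (y_max - lo) / m
    total = 0.0
    for k in range(m + 1):
        y = lo + k * h
        g = math.exp(t * math.exp(y) - y * y / 2) / math.sqrt(2 * math.pi)
        total += g / 2 if k in (0, m) else g   # trapezoid end weights
    return total * h

vals = [truncated_mgf(0.1, ym) for ym in (2.0, 4.0, 6.0, 8.0)]
print(vals)   # strictly increasing, with no sign of leveling off
```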
For this pdf,

    E|Y|^n = ∫_0^∞ x^n f_X(x)/2 dx + ∫_{−∞}^0 (−x)^n f_X(−x)/2 dx = ∫_0^∞ x^n f_X(x) dx = E(X^n),

which has been derived for any n = 1, 2, .... On the other hand,

    E(e^{tY}) = ∫_0^∞ e^{tx} f_X(x)/2 dx + ∫_{−∞}^0 e^{tx} f_X(−x)/2 dx
              = ∫_0^∞ e^{tx} f_X(x)/2 dx + ∫_0^∞ e^{−tx} f_X(x)/2 dx,

and we have shown that one of these integrals is ∞ (depending on whether t > 0 or t < 0).

Theorem. If a random variable X has finite moments a_n = E(X^n) for all n = 1, 2, ..., and the series

    Σ_{n=0}^∞ a_n t^n / n! < ∞    for some t > 0,

then the cdf of X is determined by a_n, n = 1, 2, ....
Example. Suppose that a_n = n! is the nth moment of a random variable X. Since

    Σ_{n=0}^∞ a_n t^n / n! = Σ_{n=0}^∞ t^n = 1/(1−t),    |t| < 1,

and this function is the mgf of Gamma(1,1) at t, we conclude that X ~ Gamma(1,1).
Suppose that the nth moment of a random variable Y is

    a_n = { n!/(n/2)!    if n is even
          { 0            if n is odd.

Then

    Σ_{n=0}^∞ a_n t^n / n! = Σ_{n even} n! (t²)^{n/2} / (n! (n/2)!) = Σ_{k=0}^∞ t^{2k}/k! = e^{t²},    t ∈ R.

Later, we show that this is the mgf of N(0,2); hence Y ~ N(0,2).
For the log-normal random variable X discussed earlier in this lecture, E(X^n) = e^{n²/2} and Σ_{n=0}^∞ e^{n²/2} t^n/n! = ∞ for any t > 0; hence, the theorem is not applicable.
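The second part of the example can be cross-checked arithmetically (a sketch of my own): the claimed even moments a_n = n!/(n/2)! coincide with the standard normal-moment formula σ^n (n−1)!! for σ² = 2, and the partial sums of Σ a_n t^n/n! reproduce e^{t²}:

```python
import math

# Check a_n = n!/(n/2)! against the N(0, 2) moment formula σ^n (n-1)!!
# with σ² = 2, then sum the series Σ a_n t^n / n! and compare to e^{t²}.
def double_factorial(k):
    return math.prod(range(k, 0, -2)) if k > 0 else 1

for n in (2, 4, 6, 8):
    a_n = math.factorial(n) // math.factorial(n // 2)
    assert a_n == 2 ** (n // 2) * double_factorial(n - 1)

t = 0.7
series = sum(math.factorial(n) / math.factorial(n // 2) * t ** n / math.factorial(n)
             for n in range(0, 40, 2))          # partial sum over even n
print(series, math.exp(t * t))                  # the two agree closely
```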
In applications we often need to approximate a cdf by a sequence of cdf's. The next theorem gives a sufficient condition for the convergence of cdf's and moments of random variables in terms of the convergence of mgf's.

Theorem
Suppose that X_1, X_2, ... is a sequence of random variables with mgf's M_{X_n}(t), and

    lim_{n→∞} M_{X_n}(t) = M_X(t) < ∞    for all t in a neighborhood of 0,

where M_X(t) is the mgf of a random variable X. Then, for all x at which F_X(x) is continuous,

    lim_{n→∞} F_{X_n}(x) = F_X(x).

Furthermore, for any p > 0, we have

    lim_{n→∞} E|X_n|^p = E|X|^p    and    lim_{n→∞} E|X_n − X|^p = 0.
Example (Poisson approximation)
The cdf of the binomial pmf,

    f_X(x) = C(n,x) p^x (1−p)^{n−x},    x = 0, 1, ..., n,

where n is a positive integer and 0 < p < 1, may not be easy to calculate when n is very large. It is often approximated by the cdf of the Poisson pmf,

    f_Y(x) = e^{−λ} λ^x / x!,    x = 0, 1, 2, ...,

where λ > 0 is a constant.
For this purpose, we first compute the mgf's of the binomial and Poisson distributions.

Example (binomial mgf)
Using the binomial formula

    Σ_{x=0}^n C(n,x) u^x v^{n−x} = (u + v)^n,
we obtain

    M_X(t) = E(e^{tX}) = Σ_{x=0}^n e^{tx} C(n,x) p^x (1−p)^{n−x}
           = Σ_{x=0}^n C(n,x) (pe^t)^x (1−p)^{n−x}
           = (pe^t + 1 − p)^n.

Note that

    E(X) = d/dt M_X(t) |_{t=0} = n(pe^t + 1 − p)^{n−1} pe^t |_{t=0} = np,

    E(X²) = d²/dt² M_X(t) |_{t=0}
          = [n(n−1)(pe^t + 1 − p)^{n−2} (pe^t)² + n(pe^t + 1 − p)^{n−1} pe^t] |_{t=0}
          = n(n−1)p² + np.

We obtained the same results previously, but the calculation here is simpler.
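These binomial moment formulas can be checked with finite differences of the closed-form mgf (my own sketch; n = 20 and p = 0.3 are arbitrary):

```python
import math

# Finite-difference check: derivatives of M(t) = (p e^t + 1 - p)^n at t = 0
# recover E(X) = np and E(X^2) = n(n-1)p^2 + np for the binomial.
n, p = 20, 0.3

def M(t):
    return (p * math.exp(t) + 1 - p) ** n

h = 1e-4
ex  = (M(h) - M(-h)) / (2 * h)                 # ~ M'(0)  = E(X)
ex2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2     # ~ M''(0) = E(X^2)
print(ex, n * p)                               # both ~ 6.0
print(ex2, n * (n - 1) * p ** 2 + n * p)       # both ~ 40.2
```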
Example (continued)
If Y has the Poisson pmf, then

    M_Y(t) = Σ_{x=0}^∞ e^{tx} e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^∞ (e^t λ)^x / x! = e^{−λ} e^{e^t λ} = e^{λ(e^t − 1)},

which is finite for any t ∈ R.
Let X_n have the binomial distribution with parameters n and p. Suppose that lim_{n→∞} np = λ > 0 (that means p also depends on n, and p → 0 as n → ∞). Then, for any t, as n → ∞,

    M_{X_n}(t) = (pe^t + 1 − p)^n = [1 + (np)(e^t − 1)/n]^n → e^{λ(e^t − 1)} = M_Y(t),

using the fact that, for any sequence of numbers a_n converging to a,

    lim_{n→∞} (1 + a_n/n)^n = e^a.

With this result and the preceding theorem, we can approximate P(X_n ≤ u) by P(Y ≤ u) when n is large and np is approximately a constant.
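The quality of the approximation is easy to see directly (a sketch of my own; n = 1000 and p = 0.005, so np = 5, are arbitrary choices):

```python
import math

# Poisson approximation in action: for large n and small p with np = λ,
# the binomial cdf is close to the Poisson(λ) cdf.
n, p = 1000, 0.005
lam = n * p   # λ = 5

def binom_cdf(u):
    return sum(math.comb(n, x) * p ** x * (1 - p) ** (n - x)
               for x in range(u + 1))

def poisson_cdf(u):
    return sum(math.exp(-lam) * lam ** x / math.factorial(x)
               for x in range(u + 1))

for u in (2, 5, 8):
    print(u, round(binom_cdf(u), 4), round(poisson_cdf(u), 4))
```

The two cdfs agree to a few decimal places at every point checked.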
Chapter 6 Characteristic Functions and the Central Limit Theorem 6.1 Characteristic Functions 6.1.1 Transforms and Characteristic Functions. There are several transforms or generating functions used in
More informationPartial Solutions for h4/2014s: Sampling Distributions
27 Partial Solutions for h4/24s: Sampling Distributions ( Let X and X 2 be two independent random variables, each with the same probability distribution given as follows. f(x 2 e x/2, x (a Compute the
More informationRandom Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.
Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationMATH2715: Statistical Methods
MATH2715: Statistical Methods Exercises V (based on lectures 9-10, work week 6, hand in lecture Mon 7 Nov) ALL questions count towards the continuous assessment for this module. Q1. If X gamma(α,λ), write
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4.1 (*. Let independent variables X 1,..., X n have U(0, 1 distribution. Show that for every x (0, 1, we have P ( X (1 < x 1 and P ( X (n > x 1 as n. Ex. 4.2 (**. By using induction
More informationHW3 : Moment functions Solutions
STAT/MATH 395 A - PROBABILITY II UW Spring Qurter 6 Néhémy Lim HW3 : Moment functions Solutions Problem. Let X be rel-vlued rndom vrible on probbility spce (Ω, A, P) with moment generting function M X.
More informationSTAT/MATH 395 PROBABILITY II
STAT/MATH 395 PROBABILITY II Chapter 6 : Moment Functions Néhémy Lim 1 1 Department of Statistics, University of Washington, USA Winter Quarter 2016 of Common Distributions Outline 1 2 3 of Common Distributions
More informationSome Special Distributions (Hogg Chapter Three)
Some Special Distributions Hogg Chapter Three STAT 45-: Mathematical Statistics I Fall Semester 3 Contents Binomial & Related Distributions The Binomial Distribution Some advice on calculating n 3 k Properties
More information1 Review of Probability
1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x
More informationSome Special Distributions (Hogg Chapter Three)
Some Special Distributions Hogg Chapter Three STAT 45-: Mathematical Statistics I Fall Semester 5 Contents Binomial & Related Distributions. The Binomial Distribution............... Some advice on calculating
More information