Expectation, variance and moments
John Appleby

Contents

1. Expectation and variance
2. Examples
3. Moments and the moment generating function
4. Examples of moment generating functions
5. Concluding remarks

In this lecture, we revisit the notion of the expectation and variance of a random variable, with particular reference to continuous random variables. We also describe how higher-order moments of both continuous and discrete random variables can be computed via their moment generating functions.

1 Expectation and variance

We remember from our study of discrete random variables that the expectation of a discrete random variable makes precise the notion of the average value of the random variable, while the variance measures the expected spread around the average. However, we cannot compute the expectation of a continuous random variable $X$ using the formula
$$E[X] = \sum_{\omega \in \Omega} X(\omega) P[\{\omega\}] = \sum_{x \in R_X} x P[X = x],$$
because the summation involved is not defined: the random variable $X(\omega)$ takes on more than countably many values. Moreover, as $P[X = x] = 0$ for all $x \in \mathbb{R}$ for a continuous random variable $X$, this definition, if taken seriously, always gives a value of zero! So how should one seek to define the expectation of a continuous random variable?
Suppose that the continuous random variable $X$ is supported on $(a, b)$, and has a probability density function $f$. Let's try to estimate $E[X]$ by cutting $[a, b]$ into $n$ equal subintervals, each of width $h$, so $h = (b-a)/n$. Define $x_j = a + jh$, $j = 0, 1, \ldots, n$, to be the junction points between the subintervals. Then the probability of $X$ assuming a value on $(x_j, x_{j+1})$ is
$$p_j = P[x_j < X < x_{j+1}] = \int_{x_j}^{x_{j+1}} f(x)\,dx \approx (x_{j+1} - x_j)\, f\!\left(\frac{x_j + x_{j+1}}{2}\right).$$
Since the average value of the random variable on the subinterval $(x_j, x_{j+1})$ is $(x_j + x_{j+1})/2$, an estimate of the desired expectation is approximately
$$\sum_{j=0}^{n-1} \frac{x_j + x_{j+1}}{2}\, f\!\left(\frac{x_j + x_{j+1}}{2}\right)(x_{j+1} - x_j).$$
As $n$ gets ever larger, we see that this converges to the integral
$$\int_{x=a}^{b} x f(x)\,dx.$$
This yields the required definition. Similarly, as the variance is defined as $E[X^2] - E[X]^2$, we have
$$\mathrm{Var}[X] = \int_a^b x^2 f(x)\,dx - \left(\int_a^b x f(x)\,dx\right)^2.$$

Definition 1. The expectation of the continuous random variable $X$ with probability density $f$ is
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx,$$
provided that the integral converges absolutely (i.e., $\int_{-\infty}^{\infty} |x| f(x)\,dx < \infty$); the variance of $X$ is
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = \int_{-\infty}^{\infty} x^2 f(x)\,dx - \left(\int_{-\infty}^{\infty} x f(x)\,dx\right)^2.$$

2 Examples

Example 1. Find the expectation and variance of the Gaussian random variable $X$ with distribution $N(\mu, \sigma^2)$.

Solution: By definition, $X$ has density function
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad x \in \mathbb{R}.$$
The expectation therefore is given by
$$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} x\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx.$$

Probability I (MS7) © John Appleby
Making the change of variables $y = (x-\mu)/\sigma$, we get
$$E[X] = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} (\sigma y + \mu)\, e^{-y^2/2}\,dy = \frac{\sigma}{\sqrt{2\pi}} \underbrace{\int_{-\infty}^{\infty} y\, e^{-y^2/2}\,dy}_{= \left[-e^{-y^2/2}\right]_{y=-\infty}^{\infty} = 0} + \mu \underbrace{\frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-y^2/2}\,dy}_{= 1} = \mu.$$
The variance is similarly computed: firstly, note that
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = E[(X - E[X])^2] = E[(X - \mu)^2],$$
so, by writing this as an integral and making the change of variable $y = (x-\mu)/\sigma$ again, we get
$$\mathrm{Var}[X] = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} (x-\mu)^2\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} (\sigma y)^2\, e^{-y^2/2}\,dy = \frac{\sigma^2}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y^2\, e^{-y^2/2}\,dy.$$
Since $\frac{d}{dy}\left(e^{-y^2/2}\right) = -y\, e^{-y^2/2}$, we have
$$\mathrm{Var}[X] = -\frac{\sigma^2}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y\, \frac{d}{dy}\left(e^{-y^2/2}\right)dy.$$
Integrating the last integral by parts gives
$$\mathrm{Var}[X] = -\frac{\sigma^2}{\sqrt{2\pi}} \underbrace{\left[y\, e^{-y^2/2}\right]_{y=-\infty}^{\infty}}_{= 0} + \frac{\sigma^2}{\sqrt{2\pi}} \underbrace{\int_{-\infty}^{\infty} e^{-y^2/2}\,dy}_{= \sqrt{2\pi}} = \sigma^2.$$

Exercise 2.1. If $X$ is a random variable with the $E(\lambda)$ distribution (i.e., an exponential distribution with parameter $\lambda$), show that $E[X] = 1/\lambda$. Hint: compute $\int_0^{\infty} x \lambda e^{-\lambda x}\,dx$ using integration by parts.

In Tutorials, we introduce another continuous distribution: the Cauchy distribution. A Cauchy distribution function is one which has associated probability density
$$f(x) = \frac{1}{\pi} \cdot \frac{1}{1 + x^2}, \quad x \in \mathbb{R}.$$
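The subinterval construction of Section 1 and the result of Example 1 can be cross-checked numerically. The sketch below (the function names, the parameter choices, and the truncation of the integral to a wide finite interval are my own illustrative choices, not part of the lecture) approximates $E[g(X)]$ by exactly the midpoint sum used to motivate Definition 1:

```python
import math

def normal_pdf(x, mu, sigma):
    # Gaussian density from Example 1.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def expected_value(g, f, a, b, n=200000):
    """Midpoint-sum approximation of E[g(X)] = integral of g(x) f(x) dx over (a, b),
    i.e. the subinterval construction used to motivate Definition 1."""
    h = (b - a) / n
    return sum(g(a + (j + 0.5) * h) * f(a + (j + 0.5) * h) * h for j in range(n))

mu, sigma = 1.5, 2.0
f = lambda x: normal_pdf(x, mu, sigma)
# Truncate to (mu - 40 sigma, mu + 40 sigma); the Gaussian tails beyond are negligible.
a, b = mu - 40 * sigma, mu + 40 * sigma
mean = expected_value(lambda x: x, f, a, b)
var = expected_value(lambda x: x * x, f, a, b) - mean ** 2
print(mean, var)   # close to E[X] = mu = 1.5 and Var[X] = sigma^2 = 4.0
```

As $n$ grows, the midpoint sum converges to the defining integral, which is the content of the limiting argument above.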
Here we mention an important property of the Cauchy distribution, which also highlights an important facet of the definition of the expectation of a continuous random variable: namely, that the expectation can fail to exist. This happens when $\int_{-\infty}^{\infty} |x| f(x)\,dx = \infty$. To see that this is the case for the Cauchy distribution, as the density is an even function, we have
$$\int_{-\infty}^{\infty} |x| f(x)\,dx = \int_{-\infty}^{0} \frac{(-x)}{\pi} \frac{1}{1+x^2}\,dx + \int_0^{\infty} \frac{x}{\pi} \frac{1}{1+x^2}\,dx = \frac{2}{\pi} \int_0^{\infty} \frac{x}{1+x^2}\,dx.$$
Integration by substitution ($u = 1 + x^2$) yields
$$\int_{-\infty}^{\infty} |x| f(x)\,dx = \frac{2}{\pi} \int_0^{\infty} \frac{x}{1+x^2}\,dx = \frac{1}{\pi} \int_1^{\infty} \frac{1}{u}\,du = \frac{1}{\pi} \left[\log u\right]_{u=1}^{\infty} = \infty.$$

3 Moments and the moment generating function

We have already seen how to calculate the mean ($E[X]$), and (in calculating the variance) the mean squared ($E[X^2]$), of a continuous random variable $X$. These two statistics give an indication of the location and spread of a distribution. It transpires that one can also determine whether a distribution is skewed to the left or right of its mean $\mu$ by calculating its skewness $E[(X-\mu)^3]$, and that one can determine whether the distribution is strongly centred around its mean by calculating its kurtosis, $E[(X-\mu)^4]$. In each case, one is required to compute the expected value of the random variable raised to an integer power. Such a list of values for a random variable is called its moments.

Definition 2. The moment of order $n$ of a random variable $X$ is $E[X^n]$.

Observe that the definition holds equally well for discrete random variables. We saw above that it was relatively straightforward (but required quite a lot of hard work!) to compute the mean and variance of a Gaussian random variable. Now suppose that it was requested that we calculate all the moments of a Gaussian random variable, or of a random variable that was Poisson distributed! Such a request seems quite a difficult one to respond to. Fortunately, there is a neat gimmick called the moment generating function which streamlines, and makes systematic, such calculations. The following demonstration is not rigorous, and is for motivation only.
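The skewness and kurtosis just introduced are computed by the same integral recipe as the mean and variance. A minimal numerical sketch for the exponential distribution (the function name `central_moment`, the choice $\lambda = 1$, and the truncation of the integral at 60 are my own assumptions for illustration):

```python
import math

def central_moment(r, lam=1.0, upper=60.0, n=200000):
    """E[(X - mu)^r] for X ~ E(lam) with mu = 1/lam, approximated by
    midpoint-rule integration of (x - mu)^r * lam * e^{-lam x} over (0, upper).
    The exponential tail beyond `upper` is negligible."""
    mu = 1.0 / lam
    h = upper / n
    s = 0.0
    for j in range(n):
        x = (j + 0.5) * h
        s += (x - mu) ** r * lam * math.exp(-lam * x) * h
    return s

skew = central_moment(3)   # third central moment: exactly 2/lam^3 = 2 here
kurt = central_moment(4)   # fourth central moment: exactly 9/lam^4 = 9 here
print(skew, kurt)
```

The positive third central moment confirms that the exponential distribution is skewed to the right of its mean.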
Let's consider a random variable $X$ for which all moments $E[X^n]$ exist. Recall, for any real values $t$ and $x$, that we can write $e^{tx}$ as a power series, according to
$$e^{tx} = \sum_{n=0}^{\infty} \frac{1}{n!} (tx)^n.$$
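The power series above is easy to verify numerically; this small sketch (names and the choice of fifty terms are mine) sums the partial series and compares it with the exponential:

```python
import math

def exp_series(t, x, terms=50):
    """Partial sum of sum_{n >= 0} (t x)^n / n!."""
    s, term = 0.0, 1.0          # term starts at (tx)^0 / 0! = 1
    for n in range(terms):
        s += term
        term *= t * x / (n + 1)  # (tx)^{n+1}/(n+1)! from (tx)^n/n!
    return s

print(exp_series(0.5, 2.0), math.exp(0.5 * 2.0))  # both equal e to machine precision
```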
Now, replace $x$ by the random variable $X$ to get
$$e^{tX} = \sum_{n=0}^{\infty} \frac{1}{n!} t^n X^n.$$
Now take expectations on both sides, and use the linearity of the expectation, to get
$$E[e^{tX}] = E\left[\sum_{n=0}^{\infty} \frac{1}{n!} t^n X^n\right] = \sum_{n=0}^{\infty} \frac{1}{n!} E[X^n]\, t^n. \tag{3.1}$$
Notice that we have created or generated all the moments $E[X^n]$ on the right-hand side of (3.1) from the single function $M_X(t) = E[e^{tX}]$. To obtain the moments $E[X^n]$, we just need to remember the following result from the theory of power/Taylor series: if $M$ is a function which has arbitrarily many derivatives at zero, then
$$M(t) = \sum_{n=0}^{\infty} \frac{1}{n!} M^{(n)}(0)\, t^n, \tag{3.2}$$
where $M^{(n)}(0)$ is the $n$-th derivative evaluated at $t = 0$. Comparing coefficients of $t^n$ in (3.1) and (3.2), we see that
$$E[X^n] = \left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0}.$$
This explains why the function $M_X(t) = E[e^{tX}]$ is called the moment generating function of the random variable $X$.

Definition 3. The moment generating function of a random variable $X$ (the m.g.f. of $X$) is the function $M_X(t) = E[e^{tX}]$, defined for those values of $t$ for which $E[e^{tX}] < \infty$.

From the above discussion, the following result does not seem implausible.

Proposition 1. If the moment generating function $M_X(t)$ exists in some interval containing $0$, then $X$ has finite moments of all orders, and
$$E[X^n] = \left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0}.$$

4 Examples of moment generating functions

We compute some of the moment generating functions of the random variables studied in this course, and then use them to evaluate the desired moments.

Example 2. Let $X$ be an exponentially distributed random variable with parameter $\lambda$. Compute its moment generating function, and use it to find the mean and variance of $X$.
Solution: Recalling that the density function of an $E(\lambda)$ distribution is $f(x) = \lambda e^{-\lambda x}$ for $x \geq 0$, and $f(x) = 0$ otherwise, we have
$$M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\,dx.$$
The last integral is well-defined for $t < \lambda$: hence
$$M_X(t) = \lambda \left[\frac{e^{-(\lambda - t)x}}{-(\lambda - t)}\right]_{x=0}^{\infty} = \frac{\lambda}{\lambda - t}.$$
Now, we obtain
$$M_X'(t) = \frac{\lambda}{(\lambda - t)^2}, \quad \text{so} \quad E[X] = M_X'(0) = \frac{1}{\lambda}.$$
Similarly,
$$M_X''(t) = \frac{2\lambda}{(\lambda - t)^3}, \quad \text{so} \quad E[X^2] = M_X''(0) = \frac{2}{\lambda^2}.$$
The variance of $X$ is therefore
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.$$

Example 3. Consider the Poisson random variable $X$ with probability mass function given by
$$P[X = n] = \frac{e^{-\lambda} \lambda^n}{n!}, \quad n = 0, 1, 2, \ldots.$$
Compute the moment generating function of $X$, and thereby evaluate $E[X]$ and $\mathrm{Var}[X]$.

Solution:
$$M_X(t) = E[e^{tX}] = \sum_{n=0}^{\infty} e^{tn} \frac{\lambda^n e^{-\lambda}}{n!} = e^{-\lambda} \sum_{n=0}^{\infty} \frac{(\lambda e^t)^n}{n!}.$$
By noticing that the last summation is just $\exp(\lambda e^t)$, we obtain
$$M_X(t) = e^{\lambda(e^t - 1)}.$$
To compute the mean and variance, we use Proposition 1 as before: by the chain rule, we get
$$M_X'(t) = e^{\lambda(e^t - 1)} \cdot \lambda e^t = \lambda e^t M_X(t).$$
Notice that $M_X(0) = 1$. Then
$$E[X] = M_X'(0) = \lambda e^0 M_X(0) = \lambda.$$
To calculate the second derivative of $M_X$, we use the product rule to obtain
$$M_X''(t) = M_X'(t)\, \lambda e^t + M_X(t)\, \lambda e^t.$$
Putting $t = 0$, we get
$$E[X^2] = M_X''(0) = M_X'(0)\, \lambda e^0 + M_X(0)\, \lambda e^0 = \lambda^2 + \lambda,$$
which gives
$$\mathrm{Var}[X] = E[X^2] - E[X]^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.$$
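The Poisson example can be checked numerically in two ways: computing $E[e^{tX}]$ directly from the pmf, and recovering moments by differentiating the closed form at $t = 0$. This is an illustrative sketch only (the function names and the finite-difference scheme are my own choices; the Proposition itself speaks of exact derivatives):

```python
import math

def poisson_mgf_by_sum(t, lam, terms=200):
    """E[e^{tX}] computed directly from the Poisson pmf:
    sum over n of e^{tn} * e^{-lam} * lam^n / n!."""
    s, term = 0.0, math.exp(-lam)   # term = P[X = 0]
    for n in range(terms):
        s += math.exp(t * n) * term
        term *= lam / (n + 1)       # P[X = n+1] from P[X = n]
    return s

def derivative_at_zero(M, n, h=1e-3):
    """Central finite-difference estimate of M^(n)(0): the n-th moment
    when M is a moment generating function."""
    return sum((-1) ** k * math.comb(n, k) * M((n / 2 - k) * h)
               for k in range(n + 1)) / h ** n

lam = 3.0
closed_form = lambda t: math.exp(lam * (math.exp(t) - 1))

# The direct sum and the closed form e^{lam(e^t - 1)} agree:
print(poisson_mgf_by_sum(0.7, lam), closed_form(0.7))

# Moments from derivatives at zero: E[X] = lam and E[X^2] = lam^2 + lam.
print(derivative_at_zero(closed_form, 1))   # close to 3.0
print(derivative_at_zero(closed_form, 2))   # close to 12.0
```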
Example 4. Obtain the moment generating function of the random variable $X$ which has a $N(\mu, \sigma^2)$ distribution.

Solution: Recalling that the density function of a $N(\mu, \sigma^2)$ distribution is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}, \quad x \in \mathbb{R},$$
we have
$$M_X(t) = E[e^{tX}] = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}\,dx = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[(x-\mu)^2 - 2\sigma^2 tx\right]}\,dx. \tag{4.1}$$
We now attempt to complete the square in $x$ of the term inside square brackets in the exponent in (4.1):
$$(x-\mu)^2 - 2\sigma^2 tx = x^2 - 2\mu x + \mu^2 - 2\sigma^2 tx = x^2 - 2x(\mu + \sigma^2 t) + \mu^2 = \left(x - (\mu + \sigma^2 t)\right)^2 - (\mu + \sigma^2 t)^2 + \mu^2 = (x - \mu - \sigma^2 t)^2 - 2\mu\sigma^2 t - \sigma^4 t^2.$$
Replacing this in the expression in (4.1) yields
$$M_X(t) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[(x - \mu - \sigma^2 t)^2 - 2\mu\sigma^2 t - \sigma^4 t^2\right]}\,dx = e^{\mu t + \frac{1}{2}\sigma^2 t^2}\, \underbrace{\frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}(x - \mu - \sigma^2 t)^2}\,dx}_{= 1} = e^{\mu t + \frac{1}{2}\sigma^2 t^2}.$$

Exercise 4.1. Use the above to show that $E[X] = \mu$ and $E[X^2] = \mu^2 + \sigma^2$.

5 Concluding remarks

We close with some remarks concerning moment generating functions.

The moment generating function of a random variable characterises its distribution: that is, if two random variables have the same m.g.f., they must have the same distribution function. For example, if a random variable $X$ has moment generating function $M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$, it must be a Gaussian distributed random variable with distribution $N(\mu, \sigma^2)$.

Moment generating functions are most useful for random variables which are either bounded below (i.e., taking values on $(a, \infty)$), or bounded above (i.e.,
taking values on $(-\infty, b)$). For instance, if $X$ is positive,
$$M_X(t) = E[e^{tX}] = \int_0^{\infty} e^{tx} f(x)\,dx$$
exists for all $t \leq 0$, and $M_X(-t)$ is the Laplace transform of the probability density $f$.

If the random variable $X$ is unbounded (that is, it can assume values on $(-\infty, \infty)$), it can happen that its m.g.f. does not exist for any $t \neq 0$. Instead one can make use of the characteristic function $C_X(t) = E[e^{itX}]$. The characteristic function always exists, and has similar properties to m.g.f.s:
$$i^n E[X^n] = \left.\frac{d^n}{dt^n} C_X(t)\right|_{t=0}.$$
If $X$ has a density $f$, $C_X(t)$ is closely related to the Fourier transform of $f$:
$$C_X(t) = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx.$$
For example, a Cauchy-distributed random variable has no moment generating function, but it has characteristic function
$$C(t) = \frac{1}{\pi} \int_{-\infty}^{\infty} \frac{e^{itx}}{1 + x^2}\,dx = e^{-|t|}.$$
Very often, the characteristic function of continuous random variables can be computed by contour integral and residue theory techniques from Complex Analysis.
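As a closing illustration, the Cauchy characteristic function quoted above can be approximated by truncating the defining integral; since the density is even, only the cosine part of $e^{itx}$ contributes. The truncation radius, grid size, and function name below are my own choices for a rough sketch, not part of the lecture:

```python
import math

def cauchy_cf_numeric(t, R=1000.0, n=200000):
    """Approximate C(t) = integral of cos(t x) / (pi (1 + x^2)) over (-R, R)
    by the midpoint rule. The imaginary (sine) part vanishes by symmetry,
    and the tail beyond |x| = R contributes only O(1/R)."""
    h = 2 * R / n
    s = 0.0
    for j in range(n):
        x = -R + (j + 0.5) * h
        s += math.cos(t * x) / (math.pi * (1 + x * x)) * h
    return s

# Compare with the closed form e^{-|t|} obtained by residue techniques.
for t in (0.5, 1.0, 2.0):
    print(t, cauchy_cf_numeric(t), math.exp(-abs(t)))
```

Even though the Cauchy distribution has no finite mean, this integral converges for every $t$, which is exactly the advantage of the characteristic function noted above.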
Chap 2.1 : Random Variables Let Ω be sample space of a probability model, and X a function that maps every ξ Ω, toa unique point x R, the set of real numbers. Since the outcome ξ is not certain, so is
More informationMATH 56A: STOCHASTIC PROCESSES CHAPTER 6
MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and
More informationStochastic Processes and Monte-Carlo Methods. University of Massachusetts: Spring 2018 version. Luc Rey-Bellet
Stochastic Processes and Monte-Carlo Methods University of Massachusetts: Spring 2018 version Luc Rey-Bellet April 5, 2018 Contents 1 Simulation and the Monte-Carlo method 3 1.1 Review of probability............................
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More informationSTA 4321/5325 Solution to Homework 5 March 3, 2017
STA 4/55 Solution to Homework 5 March, 7. Suppose X is a RV with E(X and V (X 4. Find E(X +. By the formula, V (X E(X E (X E(X V (X + E (X. Therefore, in the current setting, E(X V (X + E (X 4 + 4 8. Therefore,
More informationLecture 19: Properties of Expectation
Lecture 19: Properties of Expectation Dan Sloughter Furman University Mathematics 37 February 11, 4 19.1 The unconscious statistician, revisited The following is a generalization of the law of the unconscious
More informationChapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued
Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationUC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes
UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 6: Probability and Random Processes Problem Set 3 Spring 9 Self-Graded Scores Due: February 8, 9 Submit your self-graded scores
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationLIST OF FORMULAS FOR STK1100 AND STK1110
LIST OF FORMULAS FOR STK1100 AND STK1110 (Version of 11. November 2015) 1. Probability Let A, B, A 1, A 2,..., B 1, B 2,... be events, that is, subsets of a sample space Ω. a) Axioms: A probability function
More informationMATH4210 Financial Mathematics ( ) Tutorial 7
MATH40 Financial Mathematics (05-06) Tutorial 7 Review of some basic Probability: The triple (Ω, F, P) is called a probability space, where Ω denotes the sample space and F is the set of event (σ algebra
More informationPower series solutions for 2nd order linear ODE s (not necessarily with constant coefficients) a n z n. n=0
Lecture 22 Power series solutions for 2nd order linear ODE s (not necessarily with constant coefficients) Recall a few facts about power series: a n z n This series in z is centered at z 0. Here z can
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationMetric spaces and metrizability
1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively
More informationStat 5101 Lecture Slides Deck 4. Charles J. Geyer School of Statistics University of Minnesota
Stat 5101 Lecture Slides Deck 4 Charles J. Geyer School of Statistics University of Minnesota 1 Existence of Integrals Just from the definition of integral as area under the curve, the integral b g(x)
More informationMATH c UNIVERSITY OF LEEDS Examination for the Module MATH2715 (January 2015) STATISTICAL METHODS. Time allowed: 2 hours
MATH2750 This question paper consists of 8 printed pages, each of which is identified by the reference MATH275. All calculators must carry an approval sticker issued by the School of Mathematics. c UNIVERSITY
More information