Continuous Random Variables


Outline
- Continuous random variables and density
- Common continuous random variables
- Moment generating function

Seeking a Density

A continuous random variable has an uncountable sample space. An absolutely continuous rv has an absolutely continuous cdf $F$: for every $\epsilon > 0$ there exists $\delta > 0$ such that, for finitely many disjoint intervals $(a_i, b_i)$, $\sum_i (b_i - a_i) \le \delta$ implies $\sum_i |F(b_i) - F(a_i)| \le \epsilon$.

Absolutely continuous functions
- are continuous (the condition above applied to a single interval),
- are differentiable almost everywhere,
- map length-0 intervals to length-0 intervals,
- have a corresponding $f$, called the density, such that $F(a) = \int_{-\infty}^{a} f(u)\,du$.

How much probability can sit on a set of length 0?
- Absolutely continuous: nothing.
- Singular: everything.

Singularly continuous function or random variable:
- Function: a continuous function that increases only over a set whose total length is zero.
- Random variable: the sample space is uncountable but the range of the rv is a set with zero length. Placing a probability mass of 1 on an uncountable range of length 0 is singularity.
  - Cantor set: uncountable, so it can serve as the range of a continuous rv.
  - Cantor function: maps the Cantor set onto $[0,1]$, so it maps length 0 to length 1.

Continuous but Not Absolutely Continuous: Cantor Set, Cantor Function, Cantor Random Variable

1. Cantor set: start from $[0,1]$ and repeatedly exclude the open middle third of every remaining interval: exclude $(1/3, 2/3)$, then $(1/9, 2/9)$ and $(7/9, 8/9)$, and so on.
2. Cantor function: constant outside the Cantor set, increasing inside the Cantor set, continuous over all of $[0,1]$.
3. The Cantor function used as a cdf yields the Cantor random variable, which is singularly continuous.

Chalice (1991): any increasing real-valued function on $[0,1]$ that has a) $F(0) = 0$, b) $F(x/3) = F(x)/2$, c) $F(1-x) = 1 - F(x)$ is the Cantor function.

Finding Many Densities

In this course, a continuous rv usually means an absolutely continuous rv. The pdf $f$ is derived from the cdf $F$. Recall the sequence of the probability construction: $(\Omega, \mathcal{I}, P)$ and then $F(a) := P(X \le a)$. Does each $F$ lead to a unique $f$? No!

Ex: Consider $f_0(x) = 1$ for $0 \le x \le 1$, and over the same range of $x$ define a family of functions parametrized as $f_a(x) = f_0(x) + a\,I_{x=a}$ for $0 \le a \le 1$. Here $I_A$ is a binary variable taking the value 1 only when $A$ holds.
- $\int_0^x f_a(u)\,du = x = F(x)$ for $0 \le x \le 1$, because the integral is not affected by alterations at countably many points, here the single point $\{a\}$.
- Each function in the family $\{f_a : 0 \le a \le 1\}$ is a density for $F(x) = x\,I_{0 \le x \le 1} + I_{x > 1}$.

The versions in the last example are the same almost everywhere and differ only at a point. Can we generate examples that have versions differing over an interval rather than a single point? No, because $f_1 = f_2$ almost everywhere (over all intervals) if and only if $\int_A f_1(x)\,dx = \int_A f_2(x)\,dx$ for every set $A$.

Can we generate examples that have two different continuous versions? No, because two distinct continuous versions would have to differ over an interval, which the previous answer rules out.

Density and Moments

Probability density function (pdf): $f_X(x) = \lim_{u \to x} (F_X(x) - F_X(u))/(x-u)$, defined almost everywhere.
$0 = P(X(\omega) = x) \ne f_X(x)$: the density is not a probability, and it can be set arbitrarily at countably many points.
Let $A$ be the range of the rv. Which equality fails in $1 = P(\Omega) = P(\cup_{x \in A}\{X(\omega) = x\}) = \sum_{x \in A} P(X(\omega) = x) = 0$? The third: additivity does not extend to this uncountable union.

The others are similar to those of discrete rvs:
- Expected value: $E(X) = \int x f_X(x)\,dx$
- The $k$th moment: $E(X^k) = \int x^k f_X(x)\,dx$
- Variance: $V(X) = E(X^2) - E(X)^2$

Ex: For a continuous random variable $X$, let $f(u) = I_{0 \le u \le 1}(cu^2 + u)$. What value of $c$ makes $f$ a legitimate density, so that it integrates to one? Find $E(X)$.
$1 = F(1) = \int_0^1 (cu^2 + u)\,du = \left[\frac{c}{3}u^3 + \frac{1}{2}u^2\right]_{u=0}^{u=1} = \frac{c}{3} + \frac{1}{2}$, so $c = \frac{3}{2}$.
$E(X) = \int_0^1 u\left(\frac{3}{2}u^2 + u\right)du = \left[\frac{3}{8}u^4 + \frac{1}{3}u^3\right]_{u=0}^{u=1} = \frac{3}{8} + \frac{1}{3} = \frac{17}{24}$.
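The normalization and the mean above can be verified numerically; a small sketch (a midpoint Riemann sum, not part of the slides):

```python
# Midpoint Riemann sum check of the worked example:
# f(u) = c*u**2 + u on [0, 1] integrates to 1 for c = 3/2,
# and then E[X] = 3/8 + 1/3 = 17/24.
N = 100_000          # number of subintervals
h = 1.0 / N
c = 1.5
total = 0.0          # approximates the integral of f
mean = 0.0           # approximates the integral of u*f(u)
for i in range(N):
    u = h * (i + 0.5)
    f = c * u * u + u
    total += f * h
    mean += u * f * h
```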

Independence

Random variables $X$ and $Y$ are independent if $P(X \le a, Y \le b) = P(X \le a)P(Y \le b)$. This definition is inherited from the independence of events, $P(X \in A, Y \in B) = P(X \in A)P(Y \in B)$, with $A = (-\infty, a]$ and $B = (-\infty, b]$.

Ex: For two independent variables $X_1, X_2$, we have $E(X_1X_2) = E(X_1)E(X_2)$:
$E(X_1X_2) = \int\int x_1 x_2 f(x_1)f(x_2)\,dx_1\,dx_2 = \int x_1 f(x_1)\,dx_1 \int x_2 f(x_2)\,dx_2 = E(X_1)E(X_2)$

Ex: A random variable is not independent of itself, so in general $E(X^2) = E(XX) \ne E(X)E(X)$.

Ex: For two independent variables $X_1, X_2$, we have $V(X_1 + X_2) = V(X_1) + V(X_2)$.

Common Continuous Random Variables: Uniform Random Variable

A uniform random variable $X$ takes values between $x_u$ and $x_o$, and the probability of $X$ being in any subinterval of $[x_u, x_o]$ of a given length $\delta$ is the same: $P(a \le X \le a+\delta) = P(b \le X \le b+\delta)$ for $x_u \le a, b$ and $a+\delta, b+\delta \le x_o$. The uniform random variable over $[x_u, x_o]$ is denoted by $U(x_u, x_o)$.

Ex: What is the pdf of a uniform random variable over $[x_u, x_o]$? Since $P(a \le X \le a+\delta) = P(b \le X \le b+\delta)$, the density $f(u) = c$ must be constant over $[x_u, x_o]$. We need $1 = F(x_o) = \int_{x_u}^{x_o} c\,du = c(x_o - x_u)$. Hence $f(u) = \frac{I_{x_u \le u \le x_o}}{x_o - x_u}$.

Find $E(U(x_u, x_o))$: $E(U(x_u, x_o)) = \int_{x_u}^{x_o} \frac{u}{x_o - x_u}\,du = \frac{u^2}{2(x_o - x_u)}\Big|_{u=x_u}^{u=x_o} = \frac{x_u + x_o}{2}$.

Find $P(X_1 \ge X_2)$ for $X_1 = U(0,1)$, $X_2 = U(0,1)$. Note each $X_i = U(0,1)$ is independent and identical.
$P(X_1 \ge X_2) = \int_0^1\int_0^{x_1} dx_2\,dx_1 = \int_0^1 x_1\,dx_1 = \frac{x_1^2}{2}\Big|_{x_1=0}^{x_1=1} = \frac{1}{2}$
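The result $P(X_1 \ge X_2) = 1/2$ can also be checked by simulation; a sketch (seed and sample size are arbitrary choices):

```python
import random

# Monte Carlo check of P(X1 >= X2) = 1/2 for independent X1, X2 ~ U(0, 1).
random.seed(0)
n = 200_000
hits = sum(random.random() >= random.random() for _ in range(n))
p_hat = hits / n  # should be close to 0.5
```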

Common Continuous Random Variables: Exponential Random Variable

An exponential random variable $X$ takes values in $[0, \infty)$, and the probability of $X$ being in a subinterval $[a, a+\delta]$ of a given length $\delta$ decreases exponentially as $a$ increases: $P(a \le X \le a+\delta) = \exp(-\lambda(a-b))\,P(b \le X \le b+\delta)$ for $0 \le b \le a$ and $\delta \ge 0$. Here $\lambda$ is the parameter of the distribution. Start with the equality above, let first $\delta \to \infty$ and then $b = 0$:
$P(X \ge a) = \exp(-\lambda a)$
The cdf and pdf are respectively $F_X(x) = I_{x \ge 0}[1 - \exp(-\lambda x)]$ and $f_X(x) = I_{x \ge 0}\,\lambda\exp(-\lambda x)$.

Ex: Find the moments of the exponential distribution.
$E(X) = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = -xe^{-\lambda x}\Big|_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}$
$E(X^2) = \int_0^\infty x^2\lambda e^{-\lambda x}\,dx = -x^2e^{-\lambda x}\Big|_0^\infty + \int_0^\infty 2x e^{-\lambda x}\,dx = 0 + \frac{2}{\lambda}\int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \frac{2}{\lambda^2}$
As an induction hypothesis suppose that $E(X^k) = k!/\lambda^k$. The hypothesis holds for $k = 1$. It also holds for $k = 2$, which we computed above for intuition; that step can be dropped.
$E(X^{k+1}) = \int_0^\infty x^{k+1}\lambda e^{-\lambda x}\,dx = -x^{k+1}e^{-\lambda x}\Big|_0^\infty + \int_0^\infty (k+1)x^k e^{-\lambda x}\,dx = 0 + \frac{k+1}{\lambda}\int_0^\infty x^k\lambda e^{-\lambda x}\,dx = \frac{k+1}{\lambda}E(X^k) = \frac{(k+1)!}{\lambda^{k+1}}$
The last line establishes the induction hypothesis for $k+1$.

The exponential distribution has the memoryless property: $P(X > a+b) = P(X > a)P(X > b)$. If a machine is functioning at time $a$ (has not failed over $[0,a]$), the probability that it will be functioning at time $a+b$ (will not fail over $[a, a+b]$) is the same as the probability that it functioned for the first $b$ time units after its installation.
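The memoryless property is a one-line consequence of the survival function $P(X > x) = e^{-\lambda x}$; a quick check (parameter values are arbitrary):

```python
import math

# P(X > a + b) = P(X > a) * P(X > b) for the exponential survival function.
def survival(x, lam):
    return math.exp(-lam * x)

lam, a, b = 0.7, 1.3, 2.4
lhs = survival(a + b, lam)
rhs = survival(a, lam) * survival(b, lam)  # equal up to rounding
```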

Common Continuous Random Variables: Pareto Random Variable

For slower decay in the tail $P(X \ge x)$, note that $\exp(-\lambda x) \le x^{-\alpha}$ for sufficiently large $x$. Slower decay yields a heavy-tailed distribution. Seek a random variable with $P(X \ge x)$ proportional to $x^{-\alpha}$ for $x \ge x_u$ and $\alpha \ge 0$. Since $P(X \ge x_u) = 1$, set $P(X \ge x) = (x_u/x)^\alpha$ for $x \ge x_u$. The cdf and pdf are respectively
$F_X(x) = I_{x \ge x_u}\left[1 - (x_u/x)^\alpha\right]$ and $f_X(x) = I_{x \ge x_u}\,\frac{\alpha x_u^\alpha}{x^{\alpha+1}}$
(Of the two pdfs compared on the slide's figure, the Pareto pdf has the slower decay and the exponential pdf the exponential decay.)

Ex: Find $E(X)$ and the moments.
For $\alpha > 1$: $E(X) = \int_{x_u}^\infty x\,\frac{\alpha x_u^\alpha}{x^{\alpha+1}}\,dx = \alpha x_u^\alpha\,\frac{x^{-\alpha+1}}{-\alpha+1}\Big|_{x_u}^\infty = \frac{\alpha}{\alpha-1}x_u$.
For $\alpha = 1$: $E(X) = \int_{x_u}^\infty \frac{x_u}{x}\,dx = x_u\ln x\Big|_{x_u}^\infty = \infty$.
For $\alpha < 1$: $E(X) = \alpha x_u^\alpha\int_{x_u}^\infty x^{-\alpha}\,dx \ge \alpha x_u^\alpha\int_{x_u}^\infty x^{-1}\,dx$, which diverges.
Since $x^k \ge x$ for $x \ge 1$, a diverging mean implies diverging higher moments: a heavy tail causes diverging moments.

Heavy tails in practice:
- Wealth distribution in a population: there are plenty of individuals with a very large amount of wealth.
- Phone conversation duration: there are plenty of phone conversations with a very long duration.
- Number of tweets sent by an individual: there are plenty of individuals sending a large number of tweets.
- Number of failures caused by a root cause: there are plenty of root causes leading to a large number of failures. This is known as the Pareto rule in Quality Management.
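Pareto variates can be drawn by inverting the cdf, $X = x_u U^{-1/\alpha}$ for $U \sim U(0,1)$; a simulation sketch checking $E(X) = \alpha x_u/(\alpha-1)$ with arbitrary illustrative parameters:

```python
import random

# Inverse-cdf sampling of Pareto(x_u, alpha); for alpha = 3, x_u = 1
# the mean formula gives E[X] = alpha*x_u/(alpha - 1) = 1.5.
random.seed(1)
x_u, alpha, n = 1.0, 3.0, 400_000
total = 0.0
for _ in range(n):
    u = 1.0 - random.random()          # u in (0, 1], avoids division by zero
    total += x_u * u ** (-1.0 / alpha)
sample_mean = total / n
```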

Gamma Function

For a nonnegative integer $n$, $n! = n(n-1)\cdots 1$ and $0! = 1$. This factorial definition does not help for real numbers or negative integers, so let us extend the definition. Define the Gamma function for any real $n > 0$:
$\Gamma(n) = \int_0^\infty x^{n-1}\exp(-x)\,dx$
Then
$\Gamma(n+1) = \int_0^\infty x^n\exp(-x)\,dx = -x^n e^{-x}\Big|_0^\infty + \int_0^\infty n x^{n-1}e^{-x}\,dx = n\,\Gamma(n)$
$\Gamma(1) = \int_0^\infty \exp(-x)\,dx = 1$

Ex: $\Gamma(1/2) = \int_0^\infty x^{-1/2}e^{-x}\,dx = \int_0^\infty \frac{e^{-u^2}}{u}\,2u\,du = 2\int_0^\infty e^{-u^2}\,du = \sqrt{\pi}$, with the substitution $x = u^2$.
Ex: $\Gamma(7/2) = \frac{5}{2}\Gamma(5/2) = \frac{5}{2}\cdot\frac{3}{2}\Gamma(3/2) = \frac{5}{2}\cdot\frac{3}{2}\cdot\frac{1}{2}\Gamma(1/2) = \frac{15\sqrt{\pi}}{8}$
Ex: $\Gamma(0) = \infty$, by lower bounding the limit with a series diverging to $\infty$.
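These identities can be checked against `math.gamma` from the Python standard library:

```python
import math

# Gamma(1/2) = sqrt(pi), Gamma(7/2) = 15*sqrt(pi)/8,
# the recurrence Gamma(n+1) = n*Gamma(n), and Gamma(n+1) = n! for integers.
g_half = math.gamma(0.5)
g_seven_half = math.gamma(3.5)
recurrence_gap = abs(math.gamma(4.3) - 3.3 * math.gamma(3.3))
factorial_ok = all(math.isclose(math.gamma(n + 1), math.factorial(n))
                   for n in range(1, 10))
```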

Common Continuous Random Variables: Erlang and Gamma Random Variables

An Erlang rv is the sum of an integer number of independent exponential random variables.

Ex: Add up two independent $X_1, X_2 \sim Expo(\lambda)$; what is the cdf of $X = X_1 + X_2$?
$P(X \le x) = P(X_1 + X_2 \le x) = \int_0^x P(X_1 \le x-u)\,\lambda e^{-\lambda u}\,du = \int_0^x \left(1 - e^{-\lambda(x-u)}\right)\lambda e^{-\lambda u}\,du$
$= \int_0^x \lambda e^{-\lambda u}\,du - \int_0^x \lambda e^{-\lambda x}\,du = 1 - e^{-\lambda x} - \lambda x e^{-\lambda x}$
$f_X(x) = \frac{d}{dx}P(X \le x) = \lambda e^{-\lambda x} - \lambda e^{-\lambda x} + \lambda^2 x e^{-\lambda x} = \lambda e^{-\lambda x}\,\lambda x$

The sum of $\alpha$ many independent $Expo(\lambda)$ rvs has a distribution parametrized by $(\alpha, \lambda)$. The pdf of $Gamma(\alpha, \lambda)$:
$f_X(x) = \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha-1}}{\Gamma(\alpha)}$ for $x \ge 0$ and $\alpha > 0$

Ex: UTD's SOM has routers at every floor. Since the service is provided without interruption, a failed router is replaced immediately. We are told that the first floor had 7 failed routers and is running the 8th now, while the second floor had 4 routers and is running the 5th now. Suppose all routers are identical with $Expo(\lambda)$ lifetimes, with $\lambda$ measured in 1/year. $P(Gamma(8,\lambda) \ge 10)$ represents the probability that the routers on the first floor last for 10 years. If the SOM building is 8 years old, then revise the probability as $P(Gamma(8,\lambda) \ge 10 \mid Gamma(8,\lambda) \ge 8)$. For the 2nd floor, use $P(Gamma(5,\lambda) \ge 10)$ and $P(Gamma(5,\lambda) \ge 10 \mid Gamma(5,\lambda) \ge 8)$.

For integer $\alpha$, find $E(Gamma(\alpha,\lambda))$ and $V(Gamma(\alpha,\lambda))$. We can start with the first two moments of the exponential random variable: $E(Expo(\lambda)) = 1/\lambda$ and $E(Expo(\lambda)^2) = 2/\lambda^2$, which imply $V(Expo(\lambda)) = 1/\lambda^2$. Since $\alpha$ is an integer, we can apply the formulas for the expected value and variance of a sum of independent random variables: $E(Gamma(\alpha,\lambda)) = \alpha/\lambda$ and $V(Gamma(\alpha,\lambda)) = \alpha/\lambda^2$.
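The mean and variance formulas can be checked by simulating the sum of $\alpha$ exponentials; a sketch with arbitrary $\alpha = 5$, $\lambda = 2$:

```python
import random

# Sum of alpha independent Expo(lam) draws: mean alpha/lam, variance alpha/lam**2.
random.seed(2)
alpha, lam, n = 5, 2.0, 100_000
sums = [sum(random.expovariate(lam) for _ in range(alpha)) for _ in range(n)]
mean = sum(sums) / n                              # should be near 5/2 = 2.5
var = sum((s - mean) ** 2 for s in sums) / n      # should be near 5/4 = 1.25
```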

Common Continuous Random Variables: Beta Random Variable

Beta function: $B(m,k) = \int_0^1 u^{m-1}(1-u)^{k-1}\,du = \int_0^1 u^{k-1}(1-u)^{m-1}\,du = B(k,m)$. Also $B(k,m) = \frac{\Gamma(k)\Gamma(m)}{\Gamma(k+m)}$, obtained in the notes using a change of variables. It is used to normalize.

Consider Bernoulli trials with a constant but unknown success probability $P$. Suppose $n$ trials yield $m$ successes. Given $m$ successes, update the distribution of $P$; we are seeking
$P(P = p \mid Bin(n,P) = m) = \frac{P(P = p, Bin(n,P) = m)}{P(Bin(n,P) = m)} = \frac{P(P = p, Bin(n,P) = m)}{\int_{0 < u < 1} P(P = u, Bin(n,P) = m)\,du}$
Replace $P(P = u, Bin(n,P) = m)$ by $f_P(u)\,P(Bin(n,u) = m)$, where $f_P$ is the pdf of $P$, which leads to
$P(Bin(n,P) = m) = \int_0^1 f_P(u)\,P(Bin(n,u) = m)\,du$
You can skip the middle step by conditioning. For $X = P \mid Bin(n,P) = m$:
$f_X(p) = \frac{f_P(p)\,P(Bin(n,p) = m)}{\int_0^1 f_P(u)\,P(Bin(n,u) = m)\,du} = \frac{f_P(p)\,C_m^n p^m(1-p)^{n-m}}{\int_0^1 f_P(u)\,C_m^n u^m(1-u)^{n-m}\,du} = \frac{f_P(p)\,p^m(1-p)^{n-m}}{\int_0^1 f_P(u)\,u^m(1-u)^{n-m}\,du}$
To simplify further, suppose $f_P$ is uniform over $[0,1]$, an uninformative prior:
$f_X(p) = \frac{p^m(1-p)^{n-m}}{\int_0^1 u^m(1-u)^{n-m}\,du} = \frac{p^m(1-p)^{n-m}}{B(m+1, n-m+1)}$,
the posterior probability of success given $m$ successes out of $n$.
In general, a Beta random variable has $f_X(x) = \frac{x^{\alpha-1}(1-x)^{\lambda-1}}{B(\alpha,\lambda)}$ for $0 \le x \le 1$ and $\alpha, \lambda > 0$.

Expected Value of Beta Random Variable and Interpretation

Ex: $E(X) = \int_0^1 x\,\frac{x^{\alpha-1}(1-x)^{\lambda-1}}{B(\alpha,\lambda)}\,dx = \frac{B(\alpha+1,\lambda)}{B(\alpha,\lambda)} = \frac{\Gamma(\alpha+1)\Gamma(\lambda)}{\Gamma(\alpha+\lambda+1)}\cdot\frac{\Gamma(\alpha+\lambda)}{\Gamma(\alpha)\Gamma(\lambda)} = \frac{\alpha}{\alpha+\lambda}$, a ratio of successes.

The Beta random variable with $\alpha - 1$ successes and $\lambda - 1$ failures is $f_X(x) = \frac{x^{\alpha-1}(1-x)^{\lambda-1}}{B(\alpha,\lambda)}$. The corresponding expected value of the probability of success is
$\frac{\alpha}{\alpha+\lambda} = \frac{\text{Number of successes} + 1}{\text{Number of trials} + 2}$

Question: Why does the expected value differ from the ratio of the number of successes to the number of trials?
Answer: The prior distribution plays a role. Consider two special cases.
- No trials: What is the expected value of the probability of success before a trial, i.e., $\alpha = \lambda = 1$? You expect $0.5 = E(U(0,1))$ from the prior, and this is what you get from the formula: $\frac{0 + 1}{0 + 2} = \frac{1}{2}$.
- Infinitely many trials: What is the expected value of the probability of success after infinitely many trials, i.e., $\alpha, \lambda \to \infty$? Since the success probability is a constant $p_u$ but unknown, we expect the ratio of the number of successes to the number of trials to converge to this constant: $\frac{\text{Number of successes}}{n} = \frac{Bin(n, p_u)}{n} \to p_u$ in probability as $n \to \infty$, i.e., $\lim_n P(|Bin(n,p_u)/n - p_u| \le \epsilon) = 1$ for $\epsilon > 0$. Inserting this into the formula,
$\lim_n \frac{\text{Number of successes} + 1}{\text{Number of trials} + 2} = \lim_n \frac{np_u + 1}{n + 2} = p_u$
Hence the expected value of the success probability converges to the true parameter $p_u$ after infinite trials. You can say that the effect of the prior $U(0,1)$ vanishes after infinite trials.
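The identity $E(X) = B(\alpha+1,\lambda)/B(\alpha,\lambda) = \alpha/(\alpha+\lambda)$ can be checked via $B(a,b) = \Gamma(a)\Gamma(b)/\Gamma(a+b)$; the parameter values below are arbitrary:

```python
import math

# E[Beta(alpha, lam)] computed as the ratio of Beta functions on the slide.
def beta_fn(a, b):
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

alpha, lam = 4.0, 9.0   # e.g. 3 successes and 8 failures under a uniform prior
mean = beta_fn(alpha + 1, lam) / beta_fn(alpha, lam)  # should equal 4/13
```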

Common Continuous Random Variables: Normal Random Variable

The normal density is symmetric around its mean (= mode) and has light tails. The range of the normal random variable $Normal(\mu, \sigma^2)$ is all real numbers, and the density is
$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$
The cdf can be computed only numerically.

Ex: The density integrates to 1.
First note $\int \frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx = \int \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$ with $z = \frac{x-\mu}{\sigma}$.
Also $\left(\int \exp\left(-\frac{z^2}{2}\right)dz\right)^2 = \int\int \exp\left(-\frac{z_1^2+z_2^2}{2}\right)dz_1\,dz_2 = \int_0^{2\pi}\int_0^\infty r\exp\left(-\frac{r^2}{2}\right)dr\,d\theta$
The last equality is from the transformation to polar coordinates $r^2 = z_1^2 + z_2^2$, $\theta = \arccos(z_1/r)$.
Finally, $\int_0^{2\pi}\int_0^\infty r\exp\left(-\frac{r^2}{2}\right)dr\,d\theta = \int_0^{2\pi}d\theta\int_0^\infty \exp(-u)\,du = 2\pi$, so $\int \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz = 1$.

Ex: $E(X) = \int x\,\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx = \int(\mu + z\sigma)\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz = \mu + \sigma\int z\,\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz = \mu$, where the last integral is 0 by integrating the odd function $g(-z) = -g(z)$ over $(-a, a)$ and letting $a \to \infty$.

Common Continuous Random Variables: Normal Random Variable (continued)

Ex: What is the distribution of $aX + b$ if $X$ is $Normal(\mu, \sigma^2)$ and $a > 0$? We can consider $P(aX + b \le u) = P\left(X \le \frac{u-b}{a}\right)$ and differentiate with respect to $u$:
$\frac{d}{du}P\left(X \le \frac{u-b}{a}\right) = \frac{d}{du}\int^{(u-b)/a}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx = \frac{1}{\sqrt{2\pi}\,a\sigma}\exp\left(-\frac{(u - b - a\mu)^2}{2a^2\sigma^2}\right)$
The last equality is from differentiating an integral, aka the Leibniz rule. We obtain the density of $Normal(a\mu + b, a^2\sigma^2)$.

Ex: Partial expected value for $X \sim Normal(\mu,\sigma^2)$ up to $a$:
$E(X I_{X \le a}) = \int_{-\infty}^a x\,\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)dx = \int_{-\infty}^{(a-\mu)/\sigma}(\mu + z\sigma)\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$
$= \mu F_{Normal(0,1)}\left(\frac{a-\mu}{\sigma}\right) + \sigma\int_{-\infty}^{(a-\mu)/\sigma} z\,\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)dz$
$= \mu F_{Normal(0,1)}\left(\frac{a-\mu}{\sigma}\right) - \frac{\sigma}{\sqrt{2\pi}}\exp\left(-\frac{(a-\mu)^2}{2\sigma^2}\right) = \mu F_{Normal(0,1)}\left(\frac{a-\mu}{\sigma}\right) - \sigma f_{Normal(0,1)}\left(\frac{a-\mu}{\sigma}\right)$
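The partial-expectation formula can be checked against a Monte Carlo estimate, with $\Phi$ computed via `math.erf`; the parameters below are arbitrary:

```python
import math, random

# E[X 1{X<=a}] = mu*Phi((a-mu)/sigma) - sigma*phi((a-mu)/sigma) for X ~ N(mu, sigma^2).
random.seed(3)
mu, sigma, a, n = 1.0, 2.0, 2.5, 400_000
z = (a - mu) / sigma
Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
closed_form = mu * Phi - sigma * phi
mc = sum(x for x in (random.gauss(mu, sigma) for _ in range(n)) if x <= a) / n
```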

Common Continuous Random Variables: Lognormal Random Variable

$X$ is a lognormal random variable if $X = \exp(Y)$ for $Y \sim Normal(\mu, \sigma^2)$. The range of the lognormal is the nonnegative reals. This is an advantage for modelling price, demand, etc.

Ex: Find the pdf of $X$.
$P(X \le x) = P(Y \le \ln x) = \int^{\ln x}\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(y-\mu)^2}{2\sigma^2}\right)dy$, so $f_X(x) = \frac{d}{dx}P(X \le x) = \frac{1}{x\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right)$.

Ex: Find the moment $E(X^k)$.
$E(X^k) = E(\exp(kY)) = \int \exp(ky)\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(y-\mu)^2}{2\sigma^2}\right)dy = \exp(k\mu)\int \exp(ku)\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{u^2}{2\sigma^2}\right)du$ with $u = y - \mu$
$= \exp(k\mu)\exp\left(\frac{k^2\sigma^2}{2}\right)\int\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{(u - k\sigma^2)^2}{2\sigma^2}\right)du = \exp(k\mu)\exp\left(\frac{k^2\sigma^2}{2}\right)$
Finally, $E(X^k) = \exp\left(k\mu + \frac{k^2\sigma^2}{2}\right)$.
In particular, $E(X) = \exp\left(\mu + \frac{\sigma^2}{2}\right)$ and $V(X) = \exp(2\mu + 2\sigma^2) - \exp(2\mu + \sigma^2) = \exp(2\mu + \sigma^2)\left(\exp(\sigma^2) - 1\right)$.
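The moment formula can be checked by simulation; a sketch with arbitrary $\mu = 0.2$, $\sigma = 0.5$, $k = 2$:

```python
import math, random

# E[X**k] = exp(k*mu + k**2*sigma**2/2) for X = exp(Y), Y ~ Normal(mu, sigma^2).
random.seed(4)
mu, sigma, k, n = 0.2, 0.5, 2, 400_000
sample = sum(math.exp(random.gauss(mu, sigma)) ** k for _ in range(n)) / n
formula = math.exp(k * mu + 0.5 * k * k * sigma * sigma)
```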

Application: Price Evolution with Lognormal Prices in Every Period

Looking at the future from now: at time $t-1$ we observe the forward price $p(t-1, T)$ for delivery at $T$; at time $t$ we observe the spot price $p(t, t)$ and the forward price $p(t, T)$.

Multiplicative evolution:
$p(t, T) = \epsilon_t\,p(t-1, T)$, $p(t+1, T) = \epsilon_{t+1}\,p(t, T)$, and so on, so that
$p(T, T) = \epsilon_T\,p(T-1, T) = \epsilon_T\epsilon_{T-1}\,p(T-2, T) = \cdots = \epsilon_T\epsilon_{T-1}\cdots\epsilon_{t+1}\,p(t, T)$
$\ln p(T, T) - \ln p(t, T) = \sum_{i=t+1}^{T}\ln\epsilon_i$
Each $\ln\epsilon_i$ is normal, so the price is an exponential Brownian motion.

Forward Price Evolution Example

Forward prices for December observed in months Aug-Dec:
$p(Sep, Dec) = \epsilon_4\,p(Aug, Dec)$; $p(Oct, Dec) = \epsilon_3\,p(Sep, Dec)$; $p(Nov, Dec) = \epsilon_2\,p(Oct, Dec)$; $p(T{=}Dec, T{=}Dec) = \epsilon_1\,p(T{=}Nov, T{=}Dec)$.
With $p(Aug,Dec) = \$100$, $p(Sep,Dec) = \$95$, $p(Oct,Dec) = \$91$, $p(Nov,Dec) = \$104$, $p(Dec,Dec) = \$111$:
$\ln\epsilon_1 = \ln(111/104)$; $\ln\epsilon_2 = \ln(104/91)$; $\ln\epsilon_3 = \ln(91/95)$; $\ln\epsilon_4 = \ln(95/100)$.

Forward prices for November from months Jul-Nov, using a similar process (data not shown explicitly): $\ln\epsilon_1 = \ln(105/p_{Oct})$; $\ln\epsilon_2 = \ln(p_{Oct}/90)$; $\ln\epsilon_3 = \ln(90/93)$; $\ln\epsilon_4 = \ln(93/98)$, where $p_{Oct}$ is the October forward price for November.

Normal distributions fit to the logarithms of the epsilons. Good fit $\Rightarrow$ lognormal prices.

Future (forward) price model from 0 to $t$: $S_t = X_t\,s_0$ with $X_t = \epsilon_1\epsilon_2\cdots\epsilon_t$, where each epsilon is lognormal, and so is their product $X_t$.
$\ln X_t = \sum_{i=1}^t \ln\epsilon_i$: a sum of $t$ identical independent normal rvs is also normal, so $\ln X_t$ is normal with parameters $t\mu$ and $t\sigma^2$ if each $\ln\epsilon_i$ is $Normal(\mu, \sigma^2)$.
$E(S_t) = s_0\,E(X_t) = s_0\exp\left(t\mu + \frac{t\sigma^2}{2}\right)$ and $V(S_t) = s_0^2\,V(X_t) = s_0^2\left(\exp(2t\mu + 2t\sigma^2) - \exp(2t\mu + t\sigma^2)\right)$
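The log-ratio computation can be sketched in code; the price path below is a hypothetical illustration, not necessarily the slide's data:

```python
import math

# Fit of the multiplicative model: log-ratios of consecutive forward prices
# are the ln(epsilon_i), and they telescope to ln p(T,T) - ln p(t0,T).
prices = [100.0, 95.0, 91.0, 104.0, 111.0]   # hypothetical p(., Dec) path
log_eps = [math.log(b / a) for a, b in zip(prices, prices[1:])]
mu_hat = sum(log_eps) / len(log_eps)                             # fitted normal mean
var_hat = sum((x - mu_hat) ** 2 for x in log_eps) / len(log_eps) # fitted variance
telescoped = sum(log_eps)
```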

Mixtures of Random Variables

Random variables derived from others via max/min operators can have masses at the ends of their ranges. Product returns due to unmet expectations can be treated by a retailer as negative demand, so that the demand $D \in \mathbb{R}$. If you want only the new-customer demand for products, you should consider $\max\{0, D\}$, which has a mass at 0.

Ex: Consider the mixture of a mass of $q$ at zero with an $Expo(\lambda)$ random variable, whose cdf is
$F_Z(z) = q$ if $z = 0$, and $F_Z(z) = q + (1-q)\left(1 - \exp(-\lambda z)\right)$ if $z > 0$.

Given rvs $X, Y$, the mixture $Z$ can be obtained as $Z = X$ with probability $p$ and $Z = Y$ with probability $1-p$. Then we have $F_Z(z) = p\,F_X(z) + (1-p)\,F_Y(z)$. Mixing allows us to create new rvs and obtain richer examples.

Random variable decomposition: Each rv $= \lambda_D\,[\text{a discrete rv}] + \lambda_{SC}\,[\text{a singularly continuous rv}] + \lambda_{AC}\,[\text{an absolutely continuous rv}]$ for $\lambda_D + \lambda_{SC} + \lambda_{AC} = 1$.
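Sampling from the zero-mass/exponential mixture illustrates the cdf formula; a sketch with arbitrary $q = 0.3$, $\lambda = 1.5$:

```python
import math, random

# With probability q return the point mass at 0, otherwise an Expo(lam) draw;
# then P(Z = 0) ~ q and F_Z(z) ~ q + (1-q)*(1 - exp(-lam*z)).
random.seed(5)
q, lam, n = 0.3, 1.5, 200_000
draws = [0.0 if random.random() < q else random.expovariate(lam) for _ in range(n)]
p_zero = sum(d == 0.0 for d in draws) / n
z = 1.0
F_hat = sum(d <= z for d in draws) / n
F_z = q + (1 - q) * (1 - math.exp(-lam * z))
```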

Moment Generating Functions

A random variable $X$ can be described uniquely by all of its moments: $E(X), E(X^2), \ldots, E(X^k), \ldots$ A function that yields all of the moments uniquely identifies the associated random variable. Such a function is the moment generating function $m_X(t) = E(\exp(tX))$:
$m_X(t) = \sum_x \exp(tx)\,p_X(x)$ (discrete) and $m_X(t) = \int \exp(tx)\,f_X(x)\,dx$ (continuous)
Since $\exp(tx) = 1 + tx + \frac{(tx)^2}{2!} + \frac{(tx)^3}{3!} + \cdots = \sum_i \frac{(tx)^i}{i!}$,
$m_X(t) = \sum_i \frac{t^i}{i!}E(X^i)$
Recovering moments from a given mgf $m_X(t)$:
$\frac{d^k}{dt^k}m_X(t) = \sum_{i=0}^\infty \frac{E(X^i)}{i!}\frac{d^k}{dt^k}t^i = E(X^k) + \sum_{i=k+1}^\infty \frac{E(X^i)}{i!}\,i(i-1)\cdots(i-k+1)\,t^{i-k} = E(X^k) + \sum_{i=k+1}^\infty E(X^i)\frac{t^{i-k}}{(i-k)!}$
so $E(X^k) = \frac{d^k}{dt^k}m_X(t)\Big|_{t=0}$.
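Moment recovery by differentiation at $t = 0$ can be checked with finite differences; the sketch below uses the Bernoulli mgf $m(t) = 1 - p + pe^t$ with an arbitrary $p$:

```python
import math

# m(t) = 1 - p + p*exp(t) for Bernoulli(p); m'(0) = E[X] = p and m''(0) = E[X^2] = p.
p = 0.37
def mgf(t):
    return 1 - p + p * math.exp(t)

h = 1e-5
first_moment = (mgf(h) - mgf(-h)) / (2 * h)                  # central difference
second_moment = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / (h * h)  # second difference
```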

Examples of MGFs

Ex: If $X$ and $Y$ are two independent random variables with mgfs $m_X$ and $m_Y$, what is the moment generating function $m_{X+Y}$ of their sum? We proceed directly: $m_{X+Y}(t) = E(\exp(t(X+Y))) = E(\exp(tX))\,E(\exp(tY)) = m_X(t)\,m_Y(t)$, where the independence of $\exp(tX)$ and $\exp(tY)$ is used in the middle equality.

Ex: Find the moment generating function of the Bernoulli random variable. Use this function to obtain the moment generating function of the Binomial random variable. For a Bernoulli random variable $X_i$ with success probability $p$:
$m_{X_i}(t) = E(\exp(tX_i)) = \exp(0t)(1-p) + \exp(t)p = 1 - p + p\exp(t)$
The Binomial random variable $B(n, p)$ is a sum of $n$ independent Bernoulli random variables, so
$m_{\sum_{i=1}^n X_i}(t) = \prod_{i=1}^n m_{X_i}(t) = \left(1 - p + p\exp(t)\right)^n$

Ex: For an exponential random variable $X$, find the mgf $m_X(t)$ and the moments $E(X^k)$.
$m_X(t) = E(e^{tX}) = \int_0^\infty e^{tx}\lambda e^{-\lambda x}\,dx = \lambda\int_0^\infty e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t}$; the integral converges for $t < \lambda$.
$E(X^k) = \frac{d^k}{dt^k}m_X(t)\Big|_{t=0} = \frac{d^k}{dt^k}\frac{\lambda}{\lambda - t}\Big|_{t=0} = \frac{\lambda\,k!}{(\lambda - t)^{k+1}}\Big|_{t=0} = \frac{k!}{\lambda^k}$

Examples of MGFs (continued)

Ex: Find the mgf of $Gamma(n, \lambda)$ for integer $n$: $m_{Gamma(n,\lambda)}(t) = \prod_{i=1}^n m_{Expo(\lambda)}(t) = \left(\frac{\lambda}{\lambda - t}\right)^n$ for $t < \lambda$.

Ex: Find the mgf of $X = Normal(\mu, \sigma^2)$.
$m_{Normal(\mu,\sigma^2)}(t) = \int e^{tx}\frac{1}{\sqrt{2\pi}\,\sigma}e^{-0.5(x-\mu)^2/\sigma^2}\,dx = e^{\mu t}\int\frac{1}{\sqrt{2\pi}}e^{\sigma t z}e^{-0.5z^2}\,dz$ with $x = \mu + \sigma z$
$= e^{\mu t + 0.5\sigma^2t^2}\int\frac{1}{\sqrt{2\pi}}e^{-0.5(z - \sigma t)^2}\,dz = e^{\mu t + 0.5\sigma^2t^2}$,
using $-0.5z^2 + \sigma t z = -0.5(z - \sigma t)^2 + 0.5\sigma^2t^2$.

Ex: Find $m_{Normal(\mu_1,\sigma_1^2)+Normal(\mu_2,\sigma_2^2)}(t)$ for independent summands:
$m_{Normal(\mu_1,\sigma_1^2)}(t)\,m_{Normal(\mu_2,\sigma_2^2)}(t) = e^{\mu_1 t + 0.5\sigma_1^2t^2}\,e^{\mu_2 t + 0.5\sigma_2^2t^2} = e^{(\mu_1+\mu_2)t + 0.5(\sigma_1^2+\sigma_2^2)t^2} = m_{Normal(\mu_1+\mu_2,\,\sigma_1^2+\sigma_2^2)}(t)$
The sum of independent normal random variables is another normal random variable whose parameters are obtained by summing the mean and variance parameters.
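The closing claim can be checked by simulation; a sketch with arbitrary parameters:

```python
import random

# Sum of independent Normal(mu1, s1^2) and Normal(mu2, s2^2) draws should have
# mean mu1 + mu2 and variance s1^2 + s2^2.
random.seed(6)
mu1, s1, mu2, s2, n = 1.0, 0.5, -2.0, 1.2, 300_000
sums = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(n)]
mean = sum(sums) / n                             # should be near -1.0
var = sum((x - mean) ** 2 for x in sums) / n     # should be near 0.25 + 1.44 = 1.69
```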

Some Functions Are Not MGFs

We know that $m_X(0) = E(e^{0X}) = 1$. Question: Is each function with $m(0) = 1$ an mgf of some random variable? Answer: No, by counterexample. The following satisfies $m(0) = 1$ but is not an mgf:
$m(t) = 1 + \sum_{i=1}^\infty \frac{i\,t^i}{i!}$
- If there were an rv $X$, differentiating $m$ would give $E(X) = 1$, $E(X^2) = 2$, and in general $E(X^k) = k$.
- Let $Y = X^3$ be a new random variable.
- We would have $V(Y) = E(Y^2) - E(Y)^2 = E(X^6) - E(X^3)^2 = 6 - 3^2 = -3 < 0$.
- This is a contradiction, so no random variable has $m$ as its mgf.

Summary

- Continuous random variables and existence/uniqueness of density
- Common continuous random variables
- Moment generating function

The closing map relates the random variables covered so far:
- Space of discrete rvs: Bernoulli leads to Binomial (sum) and to Geometric (wait until success); Hypergeometric leads to Binomial (limit); Binomial leads to Poisson (limit); Poisson leads to Uniform (condition).
- Space of absolutely continuous rvs: Uniform leads to Beta (prior/posterior); Exponential leads to Erlang and Gamma (sum); Pareto has a heavier tail and Normal a lighter tail than Exponential; Normal leads to Lognormal (exp).
- Space of singularly continuous rvs.
- Any rv decomposes into discrete, singularly continuous, and absolutely continuous parts.