18.175: Lecture 15 Characteristic functions and central limit theorem


18.175, Lecture 15: Characteristic functions and central limit theorem. Scott Sheffield, MIT.

Outline: Characteristic functions

Characteristic functions

Let X be a random variable. The characteristic function of X is defined by φ(t) = φ_X(t) := E[e^{itX}].
Recall that by definition e^{it} = cos(t) + i sin(t).
The characteristic function φ_X is similar to the moment generating function M_X.
φ_{X+Y} = φ_X φ_Y, just as M_{X+Y} = M_X M_Y, if X and Y are independent.
And φ_{aX}(t) = φ_X(at), just as M_{aX}(t) = M_X(at).
And if X has an mth moment, then E[X^m] = i^{−m} φ_X^{(m)}(0).
Characteristic functions are well defined at all t for all random variables X.
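The multiplicativity and scaling rules above are easy to sanity-check on finite discrete laws. A minimal sketch, not from the lecture; the `cf` and `convolve` helpers and the coin/die examples are illustrative choices:

```python
import cmath

def cf(dist, t):
    # characteristic function E[e^{itX}] of a finite discrete law {value: prob}
    return sum(p * cmath.exp(1j * t * x) for x, p in dist.items())

def convolve(d1, d2):
    # law of X + Y for independent X ~ d1, Y ~ d2
    out = {}
    for x, p in d1.items():
        for y, q in d2.items():
            out[x + y] = out.get(x + y, 0.0) + p * q
    return out

coin = {1: 0.5, -1: 0.5}
die = {k: 1 / 6 for k in range(1, 7)}

t = 0.7
lhs = cf(convolve(coin, die), t)    # phi_{X+Y}(t), computed from the law of the sum
rhs = cf(coin, t) * cf(die, t)      # phi_X(t) phi_Y(t)
assert abs(lhs - rhs) < 1e-12       # independence: phi_{X+Y} = phi_X phi_Y

scaled = cf({2 * x: p for x, p in coin.items()}, t)
assert abs(scaled - cf(coin, 2 * t)) < 1e-12   # phi_{aX}(t) = phi_X(at) with a = 2
```

Differentiating `cf` numerically at 0 and multiplying by i^{−m} recovers the moments, matching the last bullet.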

Characteristic function properties

φ(0) = 1.
φ(−t) is the complex conjugate of φ(t).
|φ(t)| = |E e^{itX}| ≤ E|e^{itX}| = 1.
|φ(t + h) − φ(t)| ≤ E|e^{ihX} − 1|, so φ(t) is uniformly continuous on (−∞, ∞).
E e^{it(aX+b)} = e^{itb} φ(at).
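Each of these properties can be verified directly on a small example. A sketch, assuming an arbitrary three-point law (not from the lecture):

```python
import cmath

def cf(dist, t):
    # E[e^{itX}] for a finite discrete law {value: prob}
    return sum(p * cmath.exp(1j * t * x) for x, p in dist.items())

dist = {-1: 0.2, 0: 0.3, 2.5: 0.5}   # arbitrary illustrative law

assert abs(cf(dist, 0.0) - 1.0) < 1e-12                        # phi(0) = 1
for t in (0.3, 1.1, 4.0):
    assert abs(cf(dist, -t) - cf(dist, t).conjugate()) < 1e-12  # conjugate symmetry
    assert abs(cf(dist, t)) <= 1.0 + 1e-12                      # |phi(t)| <= 1
    # E e^{it(aX+b)} = e^{itb} phi(at), here with a = 2, b = 3
    a, b = 2.0, 3.0
    affine = {a * x + b: p for x, p in dist.items()}
    assert abs(cf(affine, t) - cmath.exp(1j * t * b) * cf(dist, a * t)) < 1e-12
```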

Characteristic function examples

Coin: if P(X = 1) = P(X = −1) = 1/2 then φ_X(t) = (e^{it} + e^{−it})/2 = cos t. That's periodic. Do we always have periodicity if X is a random integer?
Poisson: if X is Poisson with parameter λ then φ_X(t) = Σ_{k=0}^∞ e^{−λ} (λ^k/k!) e^{itk} = exp(λ(e^{it} − 1)). Why does doubling λ amount to squaring φ_X?
Normal: if X is standard normal, then φ_X(t) = e^{−t²/2}. Is φ_X always real when the law of X is symmetric about zero?
Exponential: if X is standard exponential (density e^{−x} on (0, ∞)) then φ_X(t) = 1/(1 − it).
Bilateral exponential: if f_X(x) = e^{−|x|}/2 on R then φ_X(t) = 1/(1 + t²). Use linearity of f_X ↦ φ_X.
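The coin, Poisson, and normal formulas can be checked numerically. A sketch, assuming a truncated Poisson series and a Riemann sum for the Gaussian integral (the grid sizes and the evaluation point t = 1.3 are arbitrary choices):

```python
import cmath, math

t = 1.3

# Coin: phi(t) = (e^{it} + e^{-it})/2 = cos t
coin = (cmath.exp(1j * t) + cmath.exp(-1j * t)) / 2
assert abs(coin - math.cos(t)) < 1e-12

# Poisson(lam): truncated series vs the closed form exp(lam (e^{it} - 1))
lam = 2.0
series = sum(math.exp(-lam) * lam**k / math.factorial(k) * cmath.exp(1j * t * k)
             for k in range(100))
closed = cmath.exp(lam * (cmath.exp(1j * t) - 1))
assert abs(series - closed) < 1e-12
# doubling lam squares phi_X: exp(2 lam z) = exp(lam z)^2
assert abs(cmath.exp(2 * lam * (cmath.exp(1j * t) - 1)) - closed**2) < 1e-12

# Standard normal: Riemann sum of density times e^{itx} vs e^{-t^2/2}
dx = 0.001
xs = [k * dx for k in range(-10000, 10001)]
integral = sum(math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
               * cmath.exp(1j * t * x) * dx for x in xs)
assert abs(integral - math.exp(-t * t / 2)) < 1e-5
```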

Fourier inversion formula

If f : R → C is in L¹, write f̂(t) := ∫ f(x) e^{−itx} dx.
Fourier inversion: if f is nice, then f(x) = (1/2π) ∫ f̂(t) e^{itx} dt.
Easy to check this when f is the density function of a Gaussian. Use linearity of f ↦ f̂ to extend to linear combinations of Gaussians, or to convolutions with Gaussians.
Show f ↦ f̂ is an isometry of Schwartz space (endowed with the L² norm). Extend the definition to the L² completion.
Convolution theorem: if h(x) = (f ∗ g)(x) = ∫ f(y) g(x − y) dy, then ĥ(t) = f̂(t) ĝ(t).
Possible application: ∫ 1_{[a,b]}(x) f(x) dx = (1̃_{[a,b]} ∗ f)(0), where 1̃_{[a,b]}(x) := 1_{[a,b]}(−x); by the convolution theorem and inversion, this equals (1/2π) ∫ f̂(t) ĝ(−t) dt with g = 1_{[a,b]}.
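Both the inversion formula and the convolution theorem can be tested numerically on Gaussians, matching the "easy to check" remark above. A sketch; the grids and evaluation points are arbitrary choices:

```python
import cmath, math

def normal_density(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# With fhat(t) := integral of f(x) e^{-itx} dx, the standard Gaussian density
# has fhat(t) = e^{-t^2/2}.  Check inversion f(x) = (1/2pi) int fhat(t) e^{itx} dt
# numerically at x = 0.4.
dt = 0.001
ts = [k * dt for k in range(-10000, 10001)]
x = 0.4
recovered = sum(math.exp(-t * t / 2) * cmath.exp(1j * t * x) * dt for t in ts) / (2 * math.pi)
assert abs(recovered - normal_density(x)) < 1e-5

# Convolution theorem: the density of the sum of two independent standard
# normals is N(0, 2); its transform should equal fhat(t)^2 = e^{-t^2}.
t0 = 0.8
dy = 0.001
ys = [k * dy for k in range(-15000, 15001)]
hhat = sum(math.exp(-y * y / 4) / math.sqrt(4 * math.pi)
           * cmath.exp(-1j * t0 * y) * dy for y in ys)
assert abs(hhat - math.exp(-t0 * t0)) < 1e-5
```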

Characteristic function inversion formula

If the map µ_X ↦ φ_X is linear, is the map φ ↦ µ[a, b] (for some fixed [a, b]) a linear map? How do we recover µ[a, b] from φ?
Say φ(t) = ∫ e^{itx} µ(dx).
Inversion theorem: lim_{T→∞} (2π)^{−1} ∫_{−T}^{T} [(e^{−ita} − e^{−itb})/(it)] φ(t) dt = µ(a, b) + µ({a, b})/2.
Main ideas of proof: write I_T = ∫_{−T}^{T} [(e^{−ita} − e^{−itb})/(it)] φ(t) dt = ∫_{−T}^{T} ∫ [(e^{−ita} − e^{−itb})/(it)] e^{itx} µ(dx) dt.
Observe that (e^{−ita} − e^{−itb})/(it) = ∫_a^b e^{−ity} dy has modulus bounded by b − a.
That means we can use Fubini to compute I_T.
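The inversion theorem can be watched working for the standard normal, where φ(t) = e^{−t²/2} decays fast enough that a modest T already gives the answer. A numerical sketch; the choice (a, b) = (−1, 1) and the grid are illustrative:

```python
import cmath, math

phi = lambda t: math.exp(-t * t / 2)   # standard normal characteristic function
a, b = -1.0, 1.0
T, dt = 40.0, 0.001
n = int(T / dt)

total = 0.0
for k in range(-n, n + 1):
    t = k * dt
    if t == 0.0:
        kernel = b - a   # limiting value of (e^{-ita} - e^{-itb})/(it) at t = 0
    else:
        kernel = (cmath.exp(-1j * t * a) - cmath.exp(-1j * t * b)) / (1j * t)
    total += kernel * phi(t) * dt
estimate = (total / (2 * math.pi)).real

# mu has no atoms here, so the limit is mu(a, b) = P(-1 < X < 1), about 0.682689
assert abs(estimate - 0.682689) < 1e-3
```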

Bochner's theorem

Given any function φ and any points x_1, ..., x_n, we can consider the matrix with (i, j) entry given by φ(x_i − x_j). Call φ positive definite if this matrix is always positive semidefinite Hermitian.
Bochner's theorem: a continuous function φ from R to C with φ(0) = 1 is the characteristic function of some probability measure on R if and only if it is positive definite.
Positive definiteness kind of comes from the fact that variances of random variables are non-negative.
The set of all possible characteristic functions is a pretty nice set.
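The "only if" direction is easy to probe numerically: for φ(t) = e^{−t²/2}, every matrix (φ(x_i − x_j)) should be positive semidefinite, i.e. every quadratic form v^T M v is non-negative. A sketch spot-checking a few vectors (the points and vectors are arbitrary choices):

```python
import math

phi = lambda t: math.exp(-t * t / 2)     # Gaussian characteristic function
points = [0.0, 0.3, -1.2, 2.5, 4.0]      # arbitrary illustrative points

# M[j][k] = phi(x_j - x_k); real symmetric here since phi is real and even
M = [[phi(xj - xk) for xk in points] for xj in points]

# necessary condition: v^T M v >= 0 for every vector v (we spot-check a few)
vectors = [
    [1, -1, 0, 0, 0],
    [0.5, -2, 1, 3, -0.5],
    [1, 1, 1, 1, 1],
]
for v in vectors:
    quad = sum(v[j] * M[j][k] * v[k]
               for j in range(len(points)) for k in range(len(points)))
    assert quad >= -1e-12
```

The quadratic form equals E|Σ_j v_j e^{i x_j X}|² when φ is a characteristic function, which is why it cannot be negative.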

Continuity theorems

Lévy's continuity theorem: if lim_{n→∞} φ_{X_n}(t) = φ_X(t) for all t, then X_n converge in law to X.
Slightly stronger theorem: if µ_n ⇒ µ_∞ then φ_n(t) → φ_∞(t) for all t. Conversely, if φ_n(t) converges to a limit that is continuous at 0, then the associated sequence of distributions µ_n is tight and converges weakly to a measure µ with characteristic function φ.
Proof ideas: the first statement is easy (since X_n ⇒ X implies Eg(X_n) → Eg(X) for any bounded continuous g). To get the second statement, first play around with Fubini and establish tightness of the µ_n. Then note that any subsequential limit of the µ_n must be equal to µ. Use this to argue that ∫ f dµ_n converges to ∫ f dµ for every bounded continuous f.
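A concrete instance of the continuity theorem: Binomial(n, λ/n) converges in law to Poisson(λ), and correspondingly (1 + (λ/n)(e^{it} − 1))^n → exp(λ(e^{it} − 1)) pointwise. A numerical sketch (λ = 3 and the t values are arbitrary choices):

```python
import cmath

lam = 3.0

def phi_binom(n, t):
    # characteristic function of Binomial(n, lam/n)
    return (1 + (lam / n) * (cmath.exp(1j * t) - 1)) ** n

def phi_poisson(t):
    return cmath.exp(lam * (cmath.exp(1j * t) - 1))

for t in (0.5, 1.0, 2.0):
    gaps = [abs(phi_binom(n, t) - phi_poisson(t)) for n in (10, 100, 1000, 10000)]
    assert gaps[-1] < gaps[0]      # pointwise convergence as n grows
    assert gaps[-1] < 1e-3
```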

Moments, derivatives, CLT

If ∫ |x|^n µ(dx) < ∞ then the characteristic function φ of µ has a continuous derivative of order n given by φ^{(n)}(t) = ∫ (ix)^n e^{itx} µ(dx).
In particular, if E|X|² < ∞ and EX = 0 then φ(t) = 1 − t² E(X²)/2 + o(t²).
This and the continuity theorem together imply the central limit theorem.
Theorem: let X_1, X_2, ... be i.i.d. with EX_i = µ and Var(X_i) = σ² ∈ (0, ∞). If S_n = X_1 + ... + X_n, then (S_n − nµ)/(σ n^{1/2}) converges in law to a standard normal.
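The proof sketch can be watched happening for the ±1 coin (here µ = 0, σ = 1): the standardized sum S_n/√n has characteristic function cos(t/√n)^n, which converges to e^{−t²/2}. A numerical sketch:

```python
import math

# standardized coin sum: phi_{S_n / sqrt(n)}(t) = cos(t / sqrt(n))^n
def phi_standardized(n, t):
    return math.cos(t / math.sqrt(n)) ** n

for t in (0.5, 1.0, 2.0):
    target = math.exp(-t * t / 2)   # normal characteristic function
    gap_small = abs(phi_standardized(10, t) - target)
    gap_big = abs(phi_standardized(100000, t) - target)
    assert gap_big < gap_small      # closer to the normal limit as n grows
    assert gap_big < 1e-4
```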


More information

Lecture 19: Properties of Expectation

Lecture 19: Properties of Expectation Lecture 19: Properties of Expectation Dan Sloughter Furman University Mathematics 37 February 11, 4 19.1 The unconscious statistician, revisited The following is a generalization of the law of the unconscious

More information

EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm

EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm EE/Stats 376A: Homework 7 Solutions Due on Friday March 17, 5 pm 1. Feedback does not increase the capacity. Consider a channel with feedback. We assume that all the recieved outputs are sent back immediately

More information

Lecture 16. Lectures 1-15 Review

Lecture 16. Lectures 1-15 Review 18.440: Lecture 16 Lectures 1-15 Review Scott Sheffield MIT 1 Outline Counting tricks and basic principles of probability Discrete random variables 2 Outline Counting tricks and basic principles of probability

More information

The Central Limit Theorem: More of the Story

The Central Limit Theorem: More of the Story The Central Limit Theorem: More of the Story Steven Janke November 2015 Steven Janke (Seminar) The Central Limit Theorem:More of the Story November 2015 1 / 33 Central Limit Theorem Theorem (Central Limit

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

Week 9 The Central Limit Theorem and Estimation Concepts

Week 9 The Central Limit Theorem and Estimation Concepts Week 9 and Estimation Concepts Week 9 and Estimation Concepts Week 9 Objectives 1 The Law of Large Numbers and the concept of consistency of averages are introduced. The condition of existence of the population

More information

Lecture 10. Variance and standard deviation

Lecture 10. Variance and standard deviation 18.440: Lecture 10 Variance and standard deviation Scott Sheffield MIT 1 Outline Defining variance Examples Properties Decomposition trick 2 Outline Defining variance Examples Properties Decomposition

More information

Chapter 3: Random Variables 1

Chapter 3: Random Variables 1 Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

Exercises with solutions (Set D)

Exercises with solutions (Set D) Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where

More information

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential.

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential. Math 8A Lecture 6 Friday May 7 th Epectation Recall the three main probability density functions so far () Uniform () Eponential (3) Power Law e, ( ), Math 8A Lecture 6 Friday May 7 th Epectation Eample

More information

Two hours. Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER. 14 January :45 11:45

Two hours. Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER. 14 January :45 11:45 Two hours Statistical Tables to be provided THE UNIVERSITY OF MANCHESTER PROBABILITY 2 14 January 2015 09:45 11:45 Answer ALL four questions in Section A (40 marks in total) and TWO of the THREE questions

More information

Stat410 Probability and Statistics II (F16)

Stat410 Probability and Statistics II (F16) Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two

More information

Chapter 3: Random Variables 1

Chapter 3: Random Variables 1 Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

On the Breiman conjecture

On the Breiman conjecture Péter Kevei 1 David Mason 2 1 TU Munich 2 University of Delaware 12th GPSD Bochum Outline Introduction Motivation Earlier results Partial solution Sketch of the proof Subsequential limits Further remarks

More information

Limiting Distributions

Limiting Distributions Limiting Distributions We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Math 3338, Homework 4. f(u)du. (Note that f is discontinuous, whereas F is continuous everywhere.)

Math 3338, Homework 4. f(u)du. (Note that f is discontinuous, whereas F is continuous everywhere.) Math 3338, Homework 4. Define a function f by f(x) = { if x < if x < if x. Calculate an explicit formula for F(x) x f(u)du. (Note that f is discontinuous, whereas F is continuous everywhere.) 2. Define

More information

IEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15.

IEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15. IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2011, Professor Whitt Class Lecture Notes: Thursday, September 15. Random Variables, Conditional Expectation and Transforms 1. Random

More information

Math 341: Probability Eighteenth Lecture (11/12/09)

Math 341: Probability Eighteenth Lecture (11/12/09) Math 341: Probability Eighteenth Lecture (11/12/09) Steven J Miller Williams College Steven.J.Miller@williams.edu http://www.williams.edu/go/math/sjmiller/ public html/341/ Bronfman Science Center Williams

More information

18.175: Lecture 14 Infinite divisibility and so forth

18.175: Lecture 14 Infinite divisibility and so forth 18.175 Lecture 14 18.175: Lecture 14 Infinite divisibility and so forth Scott Sheffield MIT 18.175 Lecture 14 Outline Infinite divisibility Higher dimensional CFs and CLTs Random walks Stopping times Arcsin

More information

Chp 4. Expectation and Variance

Chp 4. Expectation and Variance Chp 4. Expectation and Variance 1 Expectation In this chapter, we will introduce two objectives to directly reflect the properties of a random variable or vector, which are the Expectation and Variance.

More information

Chapter 4. Continuous Random Variables 4.1 PDF

Chapter 4. Continuous Random Variables 4.1 PDF Chapter 4 Continuous Random Variables In this chapter we study continuous random variables. The linkage between continuous and discrete random variables is the cumulative distribution (CDF) which we will

More information

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable

ECE353: Probability and Random Processes. Lecture 7 -Continuous Random Variable ECE353: Probability and Random Processes Lecture 7 -Continuous Random Variable Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu Continuous

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information

Conditional distributions (discrete case)

Conditional distributions (discrete case) Conditional distributions (discrete case) The basic idea behind conditional distributions is simple: Suppose (XY) is a jointly-distributed random vector with a discrete joint distribution. Then we can

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

2 Functions of random variables

2 Functions of random variables 2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as

More information

ST5215: Advanced Statistical Theory

ST5215: Advanced Statistical Theory Department of Statistics & Applied Probability Wednesday, October 19, 2011 Lecture 17: UMVUE and the first method of derivation Estimable parameters Let ϑ be a parameter in the family P. If there exists

More information

Chapter 3: Unbiased Estimation Lecture 22: UMVUE and the method of using a sufficient and complete statistic

Chapter 3: Unbiased Estimation Lecture 22: UMVUE and the method of using a sufficient and complete statistic Chapter 3: Unbiased Estimation Lecture 22: UMVUE and the method of using a sufficient and complete statistic Unbiased estimation Unbiased or asymptotically unbiased estimation plays an important role in

More information

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example

Example continued. Math 425 Intro to Probability Lecture 37. Example continued. Example continued : Coin tossing Math 425 Intro to Probability Lecture 37 Kenneth Harris kaharri@umich.edu Department of Mathematics University of Michigan April 8, 2009 Consider a Bernoulli trials process with

More information

17 The functional equation

17 The functional equation 18.785 Number theory I Fall 16 Lecture #17 11/8/16 17 The functional equation In the previous lecture we proved that the iemann zeta function ζ(s) has an Euler product and an analytic continuation to the

More information

8 Laws of large numbers

8 Laws of large numbers 8 Laws of large numbers 8.1 Introduction We first start with the idea of standardizing a random variable. Let X be a random variable with mean µ and variance σ 2. Then Z = (X µ)/σ will be a random variable

More information

Mathematical Methods and its Applications (Solution of assignment-12) Solution 1 From the definition of Fourier transforms, we have.

Mathematical Methods and its Applications (Solution of assignment-12) Solution 1 From the definition of Fourier transforms, we have. For 2 weeks course only Mathematical Methods and its Applications (Solution of assignment-2 Solution From the definition of Fourier transforms, we have F e at2 e at2 e it dt e at2 +(it/a dt ( setting (

More information

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.

EE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet. EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on

More information

Product measure and Fubini s theorem

Product measure and Fubini s theorem Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω

More information

ECE Lecture #9 Part 2 Overview

ECE Lecture #9 Part 2 Overview ECE 450 - Lecture #9 Part Overview Bivariate Moments Mean or Expected Value of Z = g(x, Y) Correlation and Covariance of RV s Functions of RV s: Z = g(x, Y); finding f Z (z) Method : First find F(z), by

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information