Transform Techniques. Moment Generating Function


A convenient way of finding the moments of a random variable is the moment generating function (MGF). Other transform techniques are the characteristic function, the z-transform, and the Laplace transform.

Moment Generating Function

For a real variable $t$, the MGF of the random variable $X$ is
$$M_X(t) = E[e^{tX}] = \begin{cases} \sum_k e^{t x_k}\, p_X(x_k) & \text{discrete} \\[4pt] \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx & \text{continuous} \end{cases}$$

Example - Bernoulli
The probability mass function is
$$p_X(x) = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \end{cases}$$
$$M_X(t) = e^{t \cdot 0}(1 - p) + e^{t \cdot 1} p = p e^t + 1 - p$$

Example - Exponential
For $X \sim \mathrm{Exp}(\lambda)$,
$$M_X(t) = E[e^{tX}] = \int_0^{\infty} e^{tx} f_X(x)\, dx = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\, dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\, dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda$$
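
Both example MGFs can be reproduced symbolically. The following is a minimal SymPy sketch (an addition, not part of the original notes; the symbol names `p`, `lam`, `t` are chosen here):

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)
p = sp.Symbol('p', positive=True)
lam = sp.Symbol('lambda', positive=True)

# Bernoulli(p): M_X(t) = (1-p) e^{t*0} + p e^{t*1}
M_bern = (1 - p) * sp.exp(t * 0) + p * sp.exp(t * 1)
print(sp.simplify(M_bern))                      # p*exp(t) + 1 - p

# Exponential(lambda): integrate e^{tx} * lambda * e^{-lambda x} over [0, oo).
# conds='none' drops the convergence condition t < lambda from the result.
M_exp = sp.integrate(sp.exp(t * x) * lam * sp.exp(-lam * x), (x, 0, sp.oo),
                     conds='none')
print(sp.simplify(M_exp))                       # expect lambda/(lambda - t)
```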

Properties of MGF

1) Find moments easily from the MGF
Recall
$$e^{tX} = 1 + tX + \frac{(tX)^2}{2!} + \frac{(tX)^3}{3!} + \cdots$$
Taking the expectation,
$$M_X(t) = 1 + E[X]\,t + E[X^2]\,\frac{t^2}{2!} + E[X^3]\,\frac{t^3}{3!} + \cdots$$
Differentiating with respect to $t$, $m$ times, and evaluating at $t = 0$,
$$M_X^{(m)}(0) = \left.\frac{d^m}{dt^m} M_X(t)\right|_{t=0} = E[X^m], \quad \text{the } m\text{-th moment.}$$

2) Can show two random variables have the same probability distribution
$$M_X = M_Y \;\Longrightarrow\; f_X(u) = f_Y(u)$$
If two random variables $X$ and $Y$ have the same MGF, then $X$ and $Y$ have the same probability distribution.

3) Convergence
Consider a sequence of random variables $X_1, X_2, \ldots$ with cdfs $F_{X_1}(x), F_{X_2}(x), \ldots$ and moment generating functions $M_{X_1}(t), M_{X_2}(t), \ldots$ Then
$$F_{X_n}(x) \to F_X(x) \quad \text{iff} \quad M_{X_n}(t) \to M_X(t).$$
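
As a quick illustration of property 1), the sketch below (an addition, not from the notes) differentiates the exponential MGF $\lambda/(\lambda - t)$ symbolically and recovers $E[X^m] = m!/\lambda^m$:

```python
import sympy as sp

t = sp.Symbol('t', real=True)
lam = sp.Symbol('lambda', positive=True)

M = lam / (lam - t)                       # MGF of Exp(lambda) from the previous page

for m in range(1, 5):
    moment = sp.diff(M, t, m).subs(t, 0)  # m-th derivative of the MGF at t = 0
    print(m, sp.simplify(moment))         # 1/lambda, 2/lambda**2, 6/lambda**3, 24/lambda**4
```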

4) Sum of Independent RVs
Let $X$ and $Y$ be independent random variables with MGFs $M_X(t)$ and $M_Y(t)$ respectively. Define $Z = X + Y$. Then
$$M_Z(t) = M_X(t)\, M_Y(t)$$
Proof:
$$M_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = E[e^{tX} e^{tY}] = E[e^{tX}]\, E[e^{tY}] \quad (\text{if } X \text{ and } Y \text{ are independent}) = M_X(t)\, M_Y(t)$$

5) Others
If $Y = aX + b$, then $M_Y(t) = e^{bt} M_X(at)$.
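
A Monte Carlo sanity check of property 4) (this example is an addition; the distributions, sample size, and value of $t$ are arbitrary choices): the empirical MGF of $Z = X + Y$ should match $M_X(t)\,M_Y(t)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 200_000, 0.3

X = rng.exponential(scale=1.0, size=n)      # Exp(1):  M_X(t) = 1/(1 - t),  t < 1
Y = rng.normal(loc=0.0, scale=1.0, size=n)  # N(0,1):  M_Y(t) = exp(t**2 / 2)

empirical = np.mean(np.exp(t * (X + Y)))    # sample average of e^{t(X+Y)}
theory = (1.0 / (1.0 - t)) * np.exp(t**2 / 2)
print(empirical, theory)                    # should agree to roughly two decimals
```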

Bernoulli RV
$$p_X(x) = \begin{cases} 1 - p & \text{for } x = 0 \\ p & \text{for } x = 1 \end{cases}$$
$$M_X(t) = p e^t + 1 - p$$
Now $E[X^m] = M_X^{(m)}(0)$, which must be true for any $m$:
$$M_X'(t) = p e^t, \qquad M_X'(0) = p = E[X]$$
$$M_X''(t) = p e^t, \qquad M_X''(0) = p = E[X^2]$$
In fact,
$$M_X(t) = p e^t + 1 - p = 1 + p\left(t + \frac{t^2}{2!} + \frac{t^3}{3!} + \cdots\right) = 1 + p\,t + p\,\frac{t^2}{2!} + p\,\frac{t^3}{3!} + \cdots$$
Recall that for any random variable $X$,
$$M_X(t) = 1 + E[X]\,t + E[X^2]\,\frac{t^2}{2!} + E[X^3]\,\frac{t^3}{3!} + \cdots$$
Therefore
$$E[X^n] = p \quad \text{for } n = 1, 2, \ldots$$

Poisson RV
Let $X$ be a Poisson random variable with $P[X = k] = \dfrac{\lambda^k e^{-\lambda}}{k!}$ for $k = 0, 1, 2, \ldots$
The MGF is
$$M_X(t) = e^{\lambda(e^t - 1)}$$
Homework. Derive the MGF of the Poisson rv.

Property - Sum of independent Poisson is Poisson
When $X$ and $Y$ are independent Poisson with arrival rates $\lambda_X$ and $\lambda_Y$ respectively, their MGFs are
$$M_X(t) = e^{\lambda_X(e^t - 1)} \quad \text{and} \quad M_Y(t) = e^{\lambda_Y(e^t - 1)}.$$
Thus
$$M_{X+Y}(t) = M_X(t)\, M_Y(t) = e^{(\lambda_X + \lambda_Y)(e^t - 1)} \qquad (p)$$
Eq. (p) shows $X + Y$ is Poisson with rate $\lambda_X + \lambda_Y$.
Note. Addition of independent Poisson packet streams yields a Poisson packet stream.
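
The note on merging packet streams can be checked by simulation. The sketch below is an addition with arbitrary rates and sample size; it compares the empirical pmf of the sum of two independent Poisson counts with the Poisson($\lambda_X + \lambda_Y$) pmf from scipy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam_x, lam_y, n = 2.0, 3.0, 100_000

Z = rng.poisson(lam_x, n) + rng.poisson(lam_y, n)   # merged stream counts

for k in range(10):
    print(k,
          round(float(np.mean(Z == k)), 4),                    # empirical P[Z = k]
          round(float(stats.poisson.pmf(k, lam_x + lam_y)), 4)) # Poisson(lam_x + lam_y)
```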

Exponential RV
Let $X \sim \mathrm{Exp}(\lambda)$. The MGF is
$$M_X(t) = \frac{\lambda}{\lambda - t}, \qquad t < \lambda$$
Note
$$M_X(t) = \frac{1}{1 - t/\lambda} = 1 + \frac{t}{\lambda} + \frac{t^2}{\lambda^2} + \frac{t^3}{\lambda^3} + \cdots = 1 + \frac{1}{\lambda}\,t + \frac{2}{\lambda^2}\,\frac{t^2}{2!} + \frac{3!}{\lambda^3}\,\frac{t^3}{3!} + \cdots$$
so that
$$E[X^n] = \frac{n!}{\lambda^n}$$

Property - Sum of Independent Exponentials
What is the sum of independent exponentials? Suppose
$$M_X(t) = \frac{\lambda_X}{\lambda_X - t}, \qquad M_Y(t) = \frac{\lambda_Y}{\lambda_Y - t}.$$
Then
$$M_{X+Y}(t) = \frac{\lambda_X}{\lambda_X - t}\cdot\frac{\lambda_Y}{\lambda_Y - t}$$
$X + Y$ is not exponential, but becomes 2-Erlang when $\lambda_X = \lambda_Y = \lambda$:
$$M_{X+Y}(t) = \left(\frac{\lambda}{\lambda - t}\right)^2$$
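
The 2-Erlang claim can be verified numerically. The following sketch (an addition; the rate and sample size are arbitrary) compares the empirical cdf of the sum of two iid Exp($\lambda$) samples with the Erlang cdf of shape 2 from scipy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
lam, n = 1.5, 200_000

# Sum of two independent Exp(lambda) samples
S = rng.exponential(1 / lam, n) + rng.exponential(1 / lam, n)

for s in (0.5, 1.0, 2.0, 4.0):
    print(s,
          round(float(np.mean(S <= s)), 4),                      # empirical P[S <= s]
          round(float(stats.erlang.cdf(s, a=2, scale=1 / lam)), 4))  # 2-Erlang cdf
```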

Property - Exponential and Gamma Function
Relation between the moments of an exponential random variable and the gamma function. Let $X$ be an exponential random variable with $\lambda = 1$. We know
$$E[X^n] = \frac{n!}{\lambda^n} = n!$$
which means
$$\int_0^{\infty} x^n e^{-x}\, dx = n! \qquad (e)$$
Eq. (e) shows
$$\Gamma(n+1) = \int_0^{\infty} x^n e^{-x}\, dx = n!$$
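
The identity $\Gamma(n+1) = \int_0^{\infty} x^n e^{-x}\,dx = n!$ is easy to spot-check numerically; the sketch below is an addition using scipy's gamma function and numerical quadrature.

```python
import math
from scipy.integrate import quad
from scipy.special import gamma

for n in range(6):
    # Numerically integrate x^n e^{-x} over [0, infinity)
    integral, _ = quad(lambda x, n=n: x**n * math.exp(-x), 0, math.inf)
    print(n, round(integral, 6), gamma(n + 1), math.factorial(n))  # all three agree
```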

Gaussian RV
Let $X \sim N(m, \sigma^2)$. The MGF is
$$M_X(t) = e^{mt + \sigma^2 t^2/2}$$
Homework. Derive the MGF of the Gaussian rv.

Property - Gaussian
Sum of independent Gaussians is Gaussian.
Proof. Let $M_X(t) = e^{m_X t + \sigma_X^2 t^2/2}$ and $M_Y(t) = e^{m_Y t + \sigma_Y^2 t^2/2}$. If $X$ and $Y$ are independent,
$$M_{X+Y}(t) = M_X(t)\, M_Y(t) = e^{(m_X + m_Y)t + (\sigma_X^2 + \sigma_Y^2) t^2/2} \qquad (g)$$
Eq. (g) shows $X + Y$ is Gaussian, $N(m_X + m_Y,\ \sigma_X^2 + \sigma_Y^2)$.
Also note that
$$M_{X-Y}(t) = e^{(m_X - m_Y)t + (\sigma_X^2 + \sigma_Y^2) t^2/2}$$
Note. In fact, any linear combination of jointly Gaussian random variables is Gaussian. Suppose $X$ and $Y$ are jointly Gaussian random variables. For any constants $a$, $b$ and $c$, define $V = aX + bY + c$. Then $V$ is a Gaussian random variable. $X$ and $Y$ do not have to be independent for $V$ to be Gaussian.
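
The product-of-MGFs step in the proof can be confirmed symbolically. This sketch is an addition (the symbol names are chosen here): it checks that $M_X(t)\,M_Y(t)$ equals the MGF of $N(m_X + m_Y,\ \sigma_X^2 + \sigma_Y^2)$.

```python
import sympy as sp

t = sp.Symbol('t', real=True)
m_x, m_y = sp.symbols('m_x m_y', real=True)
s_x, s_y = sp.symbols('sigma_x sigma_y', positive=True)

M_X = sp.exp(m_x * t + s_x**2 * t**2 / 2)
M_Y = sp.exp(m_y * t + s_y**2 * t**2 / 2)
M_sum = sp.exp((m_x + m_y) * t + (s_x**2 + s_y**2) * t**2 / 2)  # claimed MGF of X + Y

print(sp.simplify(M_X * M_Y - M_sum))   # 0, i.e. M_X(t) M_Y(t) has the Gaussian form above
```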

Gamma RV
Let $X \sim \mathrm{Gamma}(\alpha, \lambda)$ for any $\lambda > 0$ and $\alpha > 0$, with pdf
$$f_X(x) = \frac{\lambda (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}\, e^{-\lambda x} \quad \text{for } x \ge 0.$$
The gamma function is defined as
$$\Gamma(z+1) = \int_0^{\infty} x^{z} e^{-x}\, dx$$
The MGF is
$$M_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \quad \text{for } t < \lambda.$$
Proof.
$$M_X(t) = E[e^{tX}] = \int_0^{\infty} e^{tx}\, \frac{\lambda (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}\, e^{-\lambda x}\, dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha - 1} e^{-(\lambda - t)x}\, dx$$
$$= \frac{\lambda^{\alpha}}{(\lambda - t)^{\alpha}\, \Gamma(\alpha)} \int_0^{\infty} y^{\alpha - 1} e^{-y}\, dy \quad (\text{substituting } y = (\lambda - t)x, \text{ provided } \lambda - t > 0)$$
$$= \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \quad \text{for } t < \lambda.$$
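
The integral in the proof can also be carried out with SymPy. The following is a sketch, not part of the notes; `conds='none'` drops the convergence condition $t < \lambda$ from the result, and the output may appear as $\lambda^{\alpha}(\lambda - t)^{-\alpha}$ rather than the collected form.

```python
import sympy as sp

x = sp.Symbol('x', positive=True)
t = sp.Symbol('t', real=True)
lam, alpha = sp.symbols('lambda alpha', positive=True)

# Gamma(alpha, lambda) pdf from the page above
pdf = lam * (lam * x)**(alpha - 1) * sp.exp(-lam * x) / sp.gamma(alpha)

M = sp.integrate(sp.exp(t * x) * pdf, (x, 0, sp.oo), conds='none')
print(sp.simplify(M))   # expect lambda**alpha * (lambda - t)**(-alpha), i.e. (lambda/(lambda-t))**alpha
```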

Property - Relation between Gamma and Erlang Distributions
The m-Erlang distribution is a special case of $\mathrm{Gamma}(\alpha, \lambda)$ where $\alpha$ is a positive integer $m$.
Let $m$ be a positive integer, and assume that $X_1, X_2, \ldots, X_m$ are independent exponential random variables with arrival rate $\lambda$. Define
$$X = X_1 + X_2 + \cdots + X_m.$$
$X$ is the $m$-th arrival time in a Poisson arrival process with arrival rate $\lambda$. $X$ is referred to as an m-Erlang random variable.
Since $X_j$ is $\mathrm{Exp}(\lambda)$, $M_{X_j}(t) = \dfrac{\lambda}{\lambda - t}$. Since the $\{X_j\}$ are independent,
$$M_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^m.$$
The MGF of $X$ is identical to the MGF of $\mathrm{Gamma}(\alpha, \lambda)$ with $\alpha = m$. An m-Erlang random variable is a special case of $\mathrm{Gamma}(\alpha, \lambda)$ where $\alpha$ is a positive integer $m$.
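
A simulation sketch of the statement above (an addition; $m$, $\lambda$, and the sample size are arbitrary): the $m$-th arrival time, i.e. the sum of $m$ iid Exp($\lambda$) interarrival times, follows the Erlang/Gamma($\alpha = m$, $\lambda$) distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
m, lam, n = 5, 2.0, 100_000

# m-th arrival time = sum of m iid exponential interarrival times
T = rng.exponential(1 / lam, size=(n, m)).sum(axis=1)

for s in (1.0, 2.0, 3.0, 5.0):
    print(s,
          round(float(np.mean(T <= s)), 4),                         # empirical P[T <= s]
          round(float(stats.erlang.cdf(s, a=m, scale=1 / lam)), 4)) # m-Erlang cdf
```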

Property - Sum of independent gammas is gamma
Let $X \sim \mathrm{Gamma}(\alpha_x, \lambda)$ and $Y \sim \mathrm{Gamma}(\alpha_y, \lambda)$. Assume $X$ and $Y$ are independent. Then
$$X + Y \sim \mathrm{Gamma}(\alpha_x + \alpha_y, \lambda)$$
Proof:
$$M_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_x}, \qquad M_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_y}$$
$$M_{X+Y}(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha_x + \alpha_y}$$
so $X + Y \sim \mathrm{Gamma}(\alpha_x + \alpha_y, \lambda)$.

Example. m-Erlang
Suppose $\alpha_x$ and $\alpha_y$ are positive integers. For example, suppose $\alpha_x = 3$ and $\alpha_y = 4$. $X$ is the time when the 3rd arrival occurs, $Y$ is the time when the 4th arrival occurs, and $X + Y$ is the time when the 7th arrival occurs.

Chi-square with 1 degree of freedom
Let $X \sim N(0, 1)$. Define $Y = X^2$. Then $Y \sim \mathrm{Chi}(1)$, the chi-square distribution with 1 degree of freedom, and
$$M_Y(t) = \frac{1}{(1 - 2t)^{1/2}}, \qquad t < \frac{1}{2}.$$
Homework. MGF of Chi. Derive the MGF of Chi(1).

Property. Chi is a special case of gamma.
$$M_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \quad \text{with } \lambda = \frac{1}{2},\ \alpha = \frac{1}{2}.$$
Find the pdf of $Y$: since $\mathrm{Chi}(1) = \mathrm{Gamma}\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right)$,
$$f_Y(y) = \frac{\lambda (\lambda y)^{\alpha - 1}}{\Gamma(\alpha)}\, e^{-\lambda y} = \frac{e^{-y/2}}{\sqrt{2\pi y}}, \qquad y > 0,$$
with $\lambda = \tfrac{1}{2}$ and $\alpha = \tfrac{1}{2}$.
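
A numerical sketch of the chi-square/gamma correspondence (an addition; the sample size is arbitrary): square standard normal samples and compare the empirical cdf with both scipy's chi2(1) and Gamma($\alpha = 1/2$, rate $1/2$), i.e. shape 1/2 and scale 2 in scipy's parameterization.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
Y = rng.standard_normal(200_000) ** 2     # Y = X^2 with X ~ N(0,1)

for y in (0.1, 0.5, 1.0, 2.0, 4.0):
    print(y,
          round(float(np.mean(Y <= y)), 4),                   # empirical P[Y <= y]
          round(float(stats.chi2.cdf(y, df=1)), 4),           # Chi(1) cdf
          round(float(stats.gamma.cdf(y, a=0.5, scale=2.0)), 4))  # Gamma(1/2, rate 1/2) cdf
```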

Chi-square with k degrees of freedom
Let $X_1, X_2, \ldots, X_k$ be independent $N(0, 1)$. Define $Y = X_1^2 + X_2^2 + \cdots + X_k^2$. Then $Y \sim \mathrm{Chi}(k)$, the chi-square distribution with $k$ degrees of freedom, and
$$M_Y(t) = \frac{1}{(1 - 2t)^{k/2}}, \qquad t < \frac{1}{2},$$
a special case of $\mathrm{Gamma}(\alpha, \lambda)$ with $\lambda = \tfrac{1}{2}$, $\alpha = \tfrac{k}{2}$.
Proof. Since $X_j$ is $N(0, 1)$, $X_j^2$ is $\mathrm{Chi}(1)$, and thus is $\mathrm{Gamma}\!\left(\tfrac{1}{2}, \tfrac{1}{2}\right)$. Since the $\{X_j\}$ are independent, the $\{X_j^2\}$ are independent. Also, the sum of independent gammas is gamma. Therefore $Y$ is $\mathrm{Gamma}\!\left(\tfrac{k}{2}, \tfrac{1}{2}\right)$ and
$$M_Y(t) = \frac{1}{(1 - 2t)^{k/2}}, \qquad t < \frac{1}{2}.$$
Find the pdf of $Y$: since $\mathrm{Chi}(k) = \mathrm{Gamma}\!\left(\tfrac{k}{2}, \tfrac{1}{2}\right)$,
$$f_Y(y) = \frac{\lambda (\lambda y)^{\alpha - 1}}{\Gamma(\alpha)}\, e^{-\lambda y} \quad \text{with } \lambda = \tfrac{1}{2} \text{ and } \alpha = \tfrac{k}{2}.$$

Sum of a Random Number of Random Variables
Let $X_1, X_2, \ldots$ be iid random variables with mean $E[X]$ and variance $\sigma_X^2$. Define a new random variable
$$Y = X_1 + X_2 + \cdots + X_N$$
where $N$ is a random variable with mean $E[N]$ and variance $\sigma_N^2$. Then
$$E[Y] = E[N]\,E[X], \qquad \mathrm{VAR}(Y) = E[N]\,\sigma_X^2 + (E[X])^2\,\sigma_N^2$$
Proof. Let $M_X$, $M_Y$, $M_N$ denote the MGFs of $X$, $Y$, $N$. $N$ is a positive integer random variable with $P[N = n]$, $n = 1, 2, \ldots$ The MGF of $Y$ is
$$M_Y(t) = E[e^{t(X_1 + \cdots + X_N)}] = \sum_n E[e^{t(X_1 + \cdots + X_n)} \mid N = n]\, P[N = n] = \sum_n E[e^{t(X_1 + \cdots + X_n)}]\, P[N = n] = \sum_n \big(M_X(t)\big)^n\, P[N = n]$$
Define $u = \log M_X(t)$. Then
$$M_Y(t) = \sum_n P[N = n]\, e^{un} = E[e^{uN}] = M_N(u), \qquad u(t) = \log M_X(t) \qquad (1)$$

Differentiating (1),
$$M_Y'(t) = M_N'(u)\, u'(t), \qquad u'(t) = \frac{d}{dt}\log M_X(t) = \frac{M_X'(t)}{M_X(t)} \qquad (2)$$
Note that as $t \to 0$, $u \to \log M_X(0) = \log 1 = 0$. Therefore from (2),
$$M_Y'(0) = M_N'(0)\,\frac{M_X'(0)}{M_X(0)} = E[N]\,E[X],$$
which states
$$E[Y] = E[N]\,E[X].$$
Write (2) as
$$M_Y'(t)\, M_X(t) = M_N'(u)\, M_X'(t) \qquad (3)$$
Differentiating (3) with respect to $t$,
$$M_Y''(t)\, M_X(t) + M_Y'(t)\, M_X'(t) = M_N''(u)\, u'(t)\, M_X'(t) + M_N'(u)\, M_X''(t)$$
Recalling $u'(t) = \dfrac{M_X'(t)}{M_X(t)}$ and substituting $t = 0$ (and thus $u = 0$),
$$E[Y^2] + E[Y]\,E[X] = E[N^2]\,(E[X])^2 + E[N]\,E[X^2]$$
and hence
$$E[Y^2] = E[N^2]\,(E[X])^2 + E[N]\,\sigma_X^2$$

$$\sigma_Y^2 = E[Y^2] - (E[Y])^2 = E[N^2]\,(E[X])^2 + E[N]\,\sigma_X^2 - (E[N])^2\,(E[X])^2 = E[N]\,\sigma_X^2 + (E[X])^2\,\sigma_N^2$$

Example. Queue Length
A queue contains $N$ packets. Each packet contains $X$ bits. $N$ and $X$ are random variables. Assume $E[N]$, $\sigma_N^2$, $E[X]$, and $\sigma_X^2$ are known. Let $Y$ denote the number of bits in the queue:
$$Y = X_1 + X_2 + \cdots + X_N$$
$$E[Y] = E[N]\,E[X], \qquad \mathrm{VAR}(Y) = E[N]\,\sigma_X^2 + (E[X])^2\,\sigma_N^2.$$
If $\sigma_X^2 = 0$, that is, packets are of a fixed length, then $X_j = E[X]$, $E[Y] = E[N]\,E[X]$ and $\mathrm{VAR}(Y) = (E[X])^2\,\sigma_N^2$.
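
The queue-length example can be checked by Monte Carlo. The sketch below is an addition; all parameters (a shifted Poisson packet count so that $N \ge 1$, and uniformly distributed packet lengths) are arbitrary choices made for the illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
trials = 50_000
lam_N = 7.0                      # N = 1 + Poisson(lam_N): positive integer packet count
lo, hi = 400, 1500               # packet length X: discrete uniform on {lo, ..., hi} bits

N = 1 + rng.poisson(lam_N, trials)
Y = np.array([rng.integers(lo, hi + 1, size=n).sum() for n in N], dtype=float)

EN, var_N = 1 + lam_N, lam_N                  # mean and variance of N
EX = (lo + hi) / 2                            # mean of X
var_X = ((hi - lo + 1) ** 2 - 1) / 12         # variance of a discrete uniform

print(Y.mean(), EN * EX)                      # E[Y]   vs  E[N] E[X]
print(Y.var(), EN * var_X + EX**2 * var_N)    # VAR(Y) vs  E[N] sigma_X^2 + (E[X])^2 sigma_N^2
```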