Probability Generating Functions


Probability Generating Functions. Andreas Klappenecker, Texas A&M University. © 2018 by Andreas Klappenecker. All rights reserved.

Probability Generating Functions. Definition. Let $X$ be a discrete random variable defined on a probability space with probability measure $\Pr$. Assume that $X$ takes non-negative integer values. The probability generating function of $X$ is defined by
$$G_X(z) = E[z^X] = \sum_{k \ge 0} \Pr[X = k]\, z^k.$$
This series converges for all $z$ with $|z| \le 1$.
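As a concrete illustration (not part of the original slides), the following sympy sketch builds the probability generating function of a hypothetical fair six-sided die and checks that $G(1) = 1$.

```python
# A minimal sketch, assuming a fair six-sided die (a hypothetical example):
# G(z) = sum_{k=1}^{6} (1/6) z^k, built symbolically with sympy.
import sympy as sp

z = sp.symbols('z')
G = sum(sp.Rational(1, 6) * z**k for k in range(1, 7))

print(G)              # z/6 + z**2/6 + ... + z**6/6
print(G.subs(z, 1))   # 1, because the probabilities sum to one
```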

Expected Value. Expectation. The expected value can be expressed as
$$E[X] = \sum_{k \ge 0} k \Pr[X = k] = G_X'(1), \qquad (1)$$
where $G_X'(z)$ denotes the derivative of $G_X(z)$. Indeed,
$$G_X'(z) = \sum_{k \ge 0} k \Pr[X = k]\, z^{k-1} = \sum_{k \ge 1} k \Pr[X = k]\, z^{k-1},$$
and evaluating at $z = 1$ gives (1).

Second Moment. Second Moment. $E[X^2] = G_X''(1) + G_X'(1)$. Indeed, $G_X'(z) = \sum_{k \ge 1} k \Pr[X = k]\, z^{k-1}$ and
$$G_X''(z) = \sum_{k \ge 2} k(k-1) \Pr[X = k]\, z^{k-2} = \sum_{k \ge 2} (k^2 - k) \Pr[X = k]\, z^{k-2}.$$
Evaluating at $z = 1$ yields $G_X''(1) = E[X^2] - E[X]$, and the claim follows.

Variance. Variance.
$$\operatorname{Var}[X] = E[X^2] - E[X]^2 = G_X''(1) + G_X'(1) - G_X'(1)^2.$$
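Continuing the hypothetical die example from above (an assumption for illustration, not taken from the slides), a short sympy check of the three moment formulas:

```python
# Check E[X] = G'(1), E[X^2] = G''(1) + G'(1), Var[X] = G''(1) + G'(1) - G'(1)^2
# for the hypothetical fair six-sided die.
import sympy as sp

z = sp.symbols('z')
G = sum(sp.Rational(1, 6) * z**k for k in range(1, 7))

G1 = sp.diff(G, z).subs(z, 1)       # G'(1)
G2 = sp.diff(G, z, 2).subs(z, 1)    # G''(1)

print(G1)                 # 7/2   = E[X]
print(G2 + G1)            # 91/6  = E[X^2]
print(G2 + G1 - G1**2)    # 35/12 = Var[X]
```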

Bernoulli Variables. Example. Let $X$ be a random variable that has a Bernoulli distribution with parameter $p$. The probability generating function is $G_X(z) = (1-p) + pz$. Hence $G_X'(z) = p$ and $G_X''(z) = 0$. We obtain $E[X] = G_X'(1) = p$ and $\operatorname{Var}[X] = G_X''(1) + G_X'(1) - G_X'(1)^2 = 0 + p - p^2 = p(1-p)$.

Geometric Random Variables. Example. The probability generating function of a geometrically distributed random variable $X$ is
$$G(z) = \sum_{k \ge 1} p(1-p)^{k-1} z^k = pz \sum_{k \ge 0} (1-p)^k z^k = \frac{pz}{1 - (1-p)z}.$$
Some calculus shows that
$$G'(z) = \frac{p}{(1 - (1-p)z)^2}, \qquad G''(z) = \frac{2p(1-p)}{(1 - (1-p)z)^3}.$$
Therefore the expected value is $E[X] = G'(1) = 1/p$. The variance is given by
$$\operatorname{Var}[X] = G''(1) + G'(1) - G'(1)^2 = \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2} = \frac{1-p}{p^2}.$$
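A quick symbolic sanity check of this example (assuming, as on the slide, the geometric distribution supported on $\{1, 2, \dots\}$):

```python
# Verify E[X] = 1/p and Var[X] = (1-p)/p^2 from G(z) = pz / (1 - (1-p)z).
import sympy as sp

z, p = sp.symbols('z p', positive=True)
G = p * z / (1 - (1 - p) * z)

G1 = sp.diff(G, z).subs(z, 1)
G2 = sp.diff(G, z, 2).subs(z, 1)

print(sp.simplify(G1))                 # 1/p
print(sp.simplify(G2 + G1 - G1**2))    # (1 - p)/p**2
```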

Sums of Independent Random Variables. Proposition. Let $X_1, \dots, X_n$ be independent random variables taking values in the non-negative integers, with probability generating functions $G_{X_1}(z), \dots, G_{X_n}(z)$. The probability generating function of $X = X_1 + \cdots + X_n$ is given by the product
$$G_X(z) = \prod_{k=1}^{n} G_{X_k}(z).$$

Proof. It suffices to show this for two random variables $X$ and $Y$; the general case follows by a straightforward induction. We have
$$G_X(z)\, G_Y(z) = \Big(\sum_{k \ge 0} \Pr[X = k]\, z^k\Big)\Big(\sum_{k \ge 0} \Pr[Y = k]\, z^k\Big) = \sum_{k \ge 0} \Big(\sum_{l=0}^{k} \Pr[X = l]\Pr[Y = k - l]\Big) z^k.$$
By independence, $\Pr[X = l]\Pr[Y = k - l] = \Pr[X = l, Y = k - l]$, so
$$G_X(z)\, G_Y(z) = \sum_{k \ge 0} \Big(\sum_{l=0}^{k} \Pr[X = l, Y = k - l]\Big) z^k = \sum_{k \ge 0} \Pr[X + Y = k]\, z^k = G_{X+Y}(z).$$
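A numerical illustration of the proposition, using two independent fair dice as a hypothetical example: the PGF of the sum, computed by direct convolution of the two pmfs, coincides with the product of the individual PGFs.

```python
# Convolution check for two independent fair dice (hypothetical example).
import sympy as sp

z = sp.symbols('z')
G_die = sum(sp.Rational(1, 6) * z**k for k in range(1, 7))

# pmf of the sum via direct convolution
pmf_sum = {s: sp.Rational(0) for s in range(2, 13)}
for i in range(1, 7):
    for j in range(1, 7):
        pmf_sum[i + j] += sp.Rational(1, 36)
G_sum = sum(prob * z**s for s, prob in pmf_sum.items())

print(sp.simplify(G_sum - G_die**2))   # 0, so G_{X+Y}(z) = G_X(z) G_Y(z)
```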

Binomial Distribution. Example. Recall that the Bernoulli distribution with parameter $p$ has generating function $(1-p) + pz$. If $X_1, \dots, X_n$ are independent random variables that are Bernoulli distributed with parameter $p$, then $X = X_1 + \cdots + X_n$ is, by definition, binomially distributed with parameters $n$ and $p$. The generating function of $X$ is
$$G_X(z) = ((1-p) + pz)^n = \sum_{k=0}^{n} \binom{n}{k} (1-p)^{n-k} p^k z^k.$$

Binomial Distribution. Example (continued). We have $G_X'(z) = np((1-p) + pz)^{n-1}$. The expected value is given by $E[X] = G_X'(1) = np$.

Binomial Distribution. Example (continued). We have $G_X'(z) = np((1-p) + pz)^{n-1}$ and $G_X''(z) = n(n-1)p^2((1-p) + pz)^{n-2}$. The variance is given by
$$\operatorname{Var}[X] = G_X''(1) + G_X'(1) - G_X'(1)^2 = (n^2 - n)p^2 + np - n^2 p^2 = np - np^2 = np(1-p).$$
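A small check with hypothetical parameters ($n = 10$, $p = 3/10$, chosen arbitrarily) that the binomial PGF reproduces $E[X] = np$ and $\operatorname{Var}[X] = np(1-p)$:

```python
# Binomial PGF ((1-p) + pz)^n with illustrative parameters n = 10, p = 3/10.
import sympy as sp

z = sp.symbols('z')
n, p = 10, sp.Rational(3, 10)
G = ((1 - p) + p * z) ** n

G1 = sp.diff(G, z).subs(z, 1)
G2 = sp.diff(G, z, 2).subs(z, 1)

print(G1, n * p)                           # both 3      (E[X] = np)
print(G2 + G1 - G1**2, n * p * (1 - p))    # both 21/10  (Var[X] = np(1-p))
```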

Uniqueness Theorem. Proposition. Let $X$ and $Y$ be discrete random variables with probability generating functions $G_X(z)$ and $G_Y(z)$, respectively. Then $G_X(z) = G_Y(z)$ if and only if $\Pr[X = k] = \Pr[Y = k]$ for all integers $k \ge 0$.

Proof. If the probability distributions are the same, then evidently $G_X(z) = G_Y(z)$. Conversely, suppose that the generating functions $G_X(z)$ and $G_Y(z)$ are the same. Since the radius of convergence is at least 1, we can expand the two generating functions into power series
$$G_X(z) = \sum_{k \ge 0} \Pr[X = k]\, z^k, \qquad G_Y(z) = \sum_{k \ge 0} \Pr[Y = k]\, z^k.$$
These two power series must have identical coefficients, since the generating functions are the same. Therefore $\Pr[X = k] = \Pr[Y = k]$ for all $k \ge 0$, as claimed.

Number of Inversions

Inversion of a Permutation. Definition. Let $(a_1, a_2, \dots, a_n)$ be a permutation of the set $\{1, 2, \dots, n\}$. The pair $(a_i, a_j)$ is called an inversion if and only if $i < j$ and $a_i > a_j$. Example. The permutation $(3, 4, 1, 2)$ has the inversions $\{(3,1), (3,2), (4,1), (4,2)\}$.

Number of Inversions. Definition. Let $I_n(k)$ denote the number of permutations on $n$ points with $k$ inversions. Example. We have $I_n(0) = 1$, since only the identity has no inversions. Example. We have $I_n(1) = n - 1$. Indeed, a permutation $\pi$ has exactly one inversion if and only if $\pi$ is a transposition of the neighboring elements $k+1$ and $k$ for some $k$ in the range $1 \le k \le n - 1$.

Number of Inversions. Example. Since no permutation can have more than $\binom{n}{2}$ inversions, we have $I_n(k) = 0$ for all $k > \binom{n}{2}$. Example. By reversal of the permutations, we have the symmetry $I_n\!\big(\binom{n}{2} - k\big) = I_n(k)$.

Probability Generating Function. Suppose that we choose permutations $\pi$ uniformly at random from the symmetric group $S_n$. Let $X_n$ denote the random variable on $S_n$ that assigns to a permutation $\pi$ its number of inversions. Then the probability generating function is given by
$$G_{X_n}(z) = \sum_{k=0}^{\binom{n}{2}} \Pr[X_n = k]\, z^k = \sum_{k=0}^{\binom{n}{2}} \frac{I_n(k)}{n!}\, z^k.$$
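For a concrete feel (not on the slides), the following brute-force sketch tabulates $I_n(k)$ for $n = 4$ by enumerating $S_4$ and assembles $G_{X_4}(z)$; the symmetry $I_4(6-k) = I_4(k)$ is visible in the counts.

```python
# Brute-force inversion counts I_4(k) and the resulting PGF G_{X_4}(z).
from itertools import permutations
from math import factorial
from collections import Counter
import sympy as sp

def inversions(perm):
    """Number of pairs i < j with perm[i] > perm[j]."""
    return sum(perm[i] > perm[j]
               for i in range(len(perm)) for j in range(i + 1, len(perm)))

n = 4
counts = Counter(inversions(p) for p in permutations(range(1, n + 1)))
print(sorted(counts.items()))   # [(0, 1), (1, 3), (2, 5), (3, 6), (4, 5), (5, 3), (6, 1)]

z = sp.symbols('z')
G = sum(sp.Rational(c, factorial(n)) * z**k for k, c in counts.items())
print(sp.expand(G))             # coefficients are I_4(k)/4!
```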

Probability Generating Function. Question. Can we relate the generating functions $G_{X_n}(z)$ and $G_{X_{n-1}}(z)$? Observation. Suppose that we have a permutation $\pi_{n-1}$ on $\{1, 2, \dots, n-1\}$. If we insert the element $n$ at position $j$ with $1 \le j \le n$, then we get $n - j$ additional inversions. Example. Inserting 8 at position 3 into a permutation of $\{1, \dots, 7\}$, as in $1\;2\;8\;3\;4\;5\;6\;7$, creates $8 - 3 = 5$ additional inversions.

Probability Generating Function. Example. $1\;2\;3\;4\;5\;6\;7\;8$: 0 additional inversions. $1\;2\;3\;4\;5\;6\;8\;7$: 1 additional inversion. $8\;1\;2\;3\;4\;5\;6\;7$: 7 additional inversions. Observation. We need to insert $n$ at a position chosen uniformly at random to obtain a uniformly distributed permutation on $n$ elements from a uniformly distributed permutation on $n-1$ elements.

Probability Generating Function. Proposition.
$$G_{X_n}(z) = \begin{cases} \dfrac{1 + z + z^2 + \cdots + z^{n-1}}{n}\, G_{X_{n-1}}(z) & \text{if } n > 1, \\[1ex] 1 & \text{if } n = 1. \end{cases}$$

Probability Generating Function. Corollary.
$$G_{X_n}(z) = \frac{1}{n!} \prod_{k=1}^{n} \big(1 + z + z^2 + \cdots + z^{k-1}\big) = \prod_{k=1}^{n} \frac{1 - z^k}{k(1 - z)} = \frac{1}{n!} \prod_{k=1}^{n} \frac{1 - z^k}{1 - z}.$$
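The closed form can be cross-checked against a brute-force enumeration for a small $n$ (here $n = 5$, an arbitrary choice):

```python
# Check that (1/n!) * prod_{k=1}^{n} (1 - z^k)/(1 - z) matches the PGF
# obtained by enumerating all permutations and counting inversions.
from itertools import permutations
from math import factorial
import sympy as sp

def inversions(perm):
    return sum(perm[i] > perm[j]
               for i in range(len(perm)) for j in range(i + 1, len(perm)))

z = sp.symbols('z')
n = 5

G_brute = sum(z**inversions(p) for p in permutations(range(n))) / factorial(n)
G_closed = sp.prod((1 - z**k) / (1 - z) for k in range(1, n + 1)) / factorial(n)

print(sp.simplify(G_brute - G_closed))   # 0
```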

Factorization: Expected Value. In other words, the generating function $G_{X_n}(z)$ is the product of the generating functions of discrete uniform random variables $U_k$ on $\{0, 1, \dots, k-1\}$,
$$G_{X_n}(z) = \prod_{k=1}^{n} G_{U_k}(z), \qquad \text{where } G_{U_k}(z) = \frac{1 + z + \cdots + z^{k-1}}{k}.$$
By the product rule for $n$ functions, we have
$$G_{X_n}'(z) = \sum_{l=1}^{n} \frac{1 + 2z + 3z^2 + \cdots + (l-1)z^{l-2}}{l} \prod_{\substack{k=1 \\ k \ne l}}^{n} G_{U_k}(z).$$
Since $G_{U_k}(1) = 1$ and $G_{U_l}'(1) = (l-1)/2$, we obtain
$$E[X_n] = G_{X_n}'(1) = \sum_{l=1}^{n} \frac{l-1}{2} = \frac{n(n-1)}{4}.$$

Expected Value (Alternative Way). Example (Creating a Permutation by Inserting One Element at a Time): the slide builds a permutation step by step, starting from 1, then inserting 2, then 3, and so on, up to 8, each at some position. Observation. $X_n = U_1 + U_2 + \cdots + U_n$, where the $U_k$ are independent and $U_k$ is uniform on $\{0, 1, \dots, k-1\}$. Hence $E[X_n] = E[U_1] + E[U_2] + \cdots + E[U_n]$.

Expected Value. Proposition.
$$E[X_n] = \sum_{k=1}^{n} E[U_k] = \sum_{k=1}^{n} \frac{k-1}{2} = \frac{n(n-1)}{4}.$$

Variance. Proposition. $\operatorname{Var}[X_n] = \dfrac{2n^3 + 3n^2 - 5n}{72}$. Proof. Since $X_n = U_1 + U_2 + \cdots + U_n$ and the $U_k$ are mutually independent, we get
$$\operatorname{Var}[X_n] = \sum_{k=1}^{n} \operatorname{Var}[U_k] = \sum_{k=1}^{n} \frac{k^2 - 1}{12} = \frac{1}{12}\left(\frac{2n^3 + 3n^2 + n}{6} - n\right) = \frac{2n^3 + 3n^2 - 5n}{72}.$$
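Finally, a brute-force verification (over $S_n$ for small $n$, not part of the slides) of both the mean and variance formulas for the number of inversions:

```python
# Verify E[X_n] = n(n-1)/4 and Var[X_n] = (2n^3 + 3n^2 - 5n)/72 for small n.
from itertools import permutations
from math import factorial
from fractions import Fraction

def inversions(perm):
    return sum(perm[i] > perm[j]
               for i in range(len(perm)) for j in range(i + 1, len(perm)))

for n in range(2, 7):
    invs = [inversions(p) for p in permutations(range(n))]
    mean = Fraction(sum(invs), factorial(n))
    var = Fraction(sum(v * v for v in invs), factorial(n)) - mean**2
    assert mean == Fraction(n * (n - 1), 4)
    assert var == Fraction(2 * n**3 + 3 * n**2 - 5 * n, 72)
    print(n, mean, var)
```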