Random Models

Tusheng Zhang

February 14, 2013

1 Introduction

In this module we will introduce some random models which have many real-life applications. The course consists of four parts.

1. A brief review of probability theory.
2. Simple random walks.
3. Branching processes.
4. Renewal processes.

Why random? Our life is affected by random variations, uncertainties and unexpected events. For example, the credit crunch, the banking crisis, the earthquake in Haiti, the temperature, the price of a stock, the price of oil and many other quantities are affected by random, unpredictable factors. Meanwhile, these quantities also evolve in time. It is very important to learn how to handle random variations. This is the aim of this and many other probability courses. The prerequisites for this module are Probability I and Probability II.

To see what kind of problems we are going to treat, let us look at some examples.

Example 1.1 A gambler has a capital of 10000. He wins or loses 100 with probability 1/3 or 2/3 at each bet. What is the probability (the chance) that the gambler is eventually ruined?

Example 1.2 What is the probability that a drunk man (wandering on the street) eventually gets home safely?

Example 1.3 Family tree. Suppose that the first generation starts with a couple (of rabbits). We can ask, for example: what is the probability that the 10th generation consists of 30 members? What is the probability that the family tree eventually becomes extinct?

1.1. A brief review of probability theory

A probability space (Ω, F, P) consists of three objects: Ω is the sample space, made up of all possible outcomes; F is the collection of events; P(A) is the probability that the event A happens.

Computation rules:
1. If A ⊂ B, then P(A) ≤ P(B).

2. Let A^c denote the complement of A. Then P(A^c) = 1 − P(A).

3. P(A ∪ B) = P(A) + P(B) − P(A ∩ B), where A ∩ B is the intersection of A and B (both A and B).

1.2. Conditional probability

The probability of A given that event B has occurred is called the conditional probability of A given B, defined by

P(A|B) = P(A ∩ B) / P(B).

Consequences.

1. From the definition, P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A).

2. The law of total probability. Let {B_i}_{i≥1} be a partition of the sample space Ω, i.e., the B_i are disjoint and Ω = ∪_{i≥1} B_i. Then for any event A it holds that

P(A) = Σ_{i≥1} P(A ∩ B_i) = Σ_{i≥1} P(B_i)P(A|B_i).

Example 2.1. Suppose that the probability that a man is color-blind is 0.005 and the probability that a woman is color-blind is 0.0025. Assume that the population is evenly divided between men and women. (1) Find the probability that a randomly chosen person from the population is color-blind. (2) Suppose that a person chosen from the population is color-blind. What is the probability that the person is a man?

Let

C = {the person is color-blind}, M = {the person is a man}, W = {the person is a woman}.

Then

P(M) = 1/2, P(W) = 1/2, P(C|M) = 0.005, P(C|W) = 0.0025.

(1). C = (C ∩ M) ∪ (C ∩ W)
and

P(C) = P(M)P(C|M) + P(W)P(C|W) = (1/2)(0.005) + (1/2)(0.0025) ≈ 0.0038.

(2).

P(M|C) = P(M ∩ C)/P(C) = P(M)P(C|M)/P(C) = (1/2)(0.005)/0.0038 ≈ 0.66.

1.3. Independence

If P(A|B) = P(A), meaning that the probability of A remains the same whether or not B has occurred (i.e., A has nothing to do with B), we say that A and B are independent. In this case,

P(A ∩ B) = P(A)P(B).

Events A_1, A_2, ... are said to be mutually independent if for each integer k and each choice of distinct integers n_1, ..., n_k,

P(∩_{i=1}^k A_{n_i}) = Π_{i=1}^k P(A_{n_i}).

Example 2.2 Toss of a coin. Let A = {head in the first toss} and B = {tail in the second toss}. Then A and B are independent.

1.4. Random variables

A random variable X is a real-valued function of outcomes. The value of X depends on the outcome of the experiment.

Example 2.3 Let X be the number of black balls one gets when choosing 4 balls from a box that contains 6 black balls and 4 white balls. Then X is a random variable.

Example: X is the number of points one gets when rolling a die.

Discrete random variables. If a random variable X can only assume countably many values {x_1, x_2, ..., x_n, ...}, X is called a discrete random variable. In this case, {f_X(x_r) = P(X = x_r), r = 1, 2, ...} is called the probability mass (distribution) function of X.

Expectation (mean). If X is a discrete random variable taking values x_1, x_2, ..., x_n, ..., the expectation or mean of X is defined as

µ = E(X) = Σ_{r≥1} x_r P(X = x_r).
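The law of total probability and Bayes' computation above are easy to check numerically. Below is a minimal Python sketch, not part of the notes; the value 0.0025 for the conditional probability of color-blindness given a woman is an assumption chosen to be consistent with the answers 0.0038 and 0.66 obtained in the worked example.

```python
# Numerical check of the law of total probability and Bayes' formula.
# The figure 0.0025 for P(C|W) is an assumption consistent with the
# stated answers 0.0038 and 0.66.
p_m, p_w = 0.5, 0.5          # P(M), P(W): population evenly divided
p_c_given_m = 0.005          # P(C|M)
p_c_given_w = 0.0025         # P(C|W), assumed (see above)

# Law of total probability: P(C) = P(M)P(C|M) + P(W)P(C|W)
p_c = p_m * p_c_given_m + p_w * p_c_given_w

# Bayes' formula: P(M|C) = P(M)P(C|M) / P(C)
p_m_given_c = p_m * p_c_given_m / p_c

print(p_c, p_m_given_c)  # approximately 0.0038 and 0.66
```

Running the sketch reproduces P(C) = 0.00375 ≈ 0.0038 and P(M|C) = 2/3 ≈ 0.66.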
Variance: The variance of X is defined as

V(X) = E[(X − µ)^2] = Σ_{r≥1} (x_r − µ)^2 P(X = x_r),

which measures the deviation of X from its mean.

Independence of random variables. Let X be a random variable taking values x_1, ..., x_r, ... and Y a random variable taking values y_1, ..., y_r, .... We say that X and Y are independent if for all x_i, y_j,

P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j).

If X and Y are independent, we can deduce that E(XY) = E(X)E(Y).

Examples of discrete distributions.

1. Bernoulli distribution. Write X ~ Bern(p). In this case the possible values of X are 0 and 1, and P(X = 1) = p, P(X = 0) = q = 1 − p. For example, if X is the number of heads we get when tossing a fair coin once, then X ~ Bern(1/2).

2. Binomial distribution with parameters n and p. Write X ~ Bin(n, p). X is used to denote the number of times a certain event occurs in a series of n repeated trials. It holds that

P(X = k) = (n choose k) p^k (1 − p)^{n−k}, k = 0, ..., n, and E[X] = np.

Example: X = the number of heads we get when tossing a fair coin n times. Then X ~ Bin(n, 1/2).

3. Geometric distribution. We say that X has a geometric distribution with parameter p if

P(X = n) = p(1 − p)^{n−1}, n = 1, 2, 3, ...

Write X ~ G(p). In this case E[X] = 1/p. For example, the number X of tosses of a fair coin we need to get the first head has a geometric distribution with parameter 1/2.

4. We say X has a Poisson distribution with parameter λ, written X ~ Poiss(λ), if

P(X = n) = e^{−λ} λ^n / n!, n = 0, 1, ...

In this case E(X) = V(X) = λ.
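The means quoted for these distributions can be verified directly from their probability mass functions by summing n·P(X = n). A short Python sketch, with arbitrary illustrative parameter values (the infinite geometric and Poisson sums are truncated where the tails are negligible):

```python
from math import comb, exp, factorial

# Binomial Bin(n, p): P(X = k) = (n choose k) p^k (1-p)^(n-k), mean np
n, p = 10, 0.5
mean_bin = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Geometric G(p): P(X = k) = p(1-p)^(k-1), mean 1/p
# (sum truncated at k = 200; the remaining tail is negligible for p = 0.5)
mean_geom = sum(k * p * (1 - p)**(k - 1) for k in range(1, 201))

# Poisson Poiss(lam): P(X = k) = e^(-lam) lam^k / k!, mean lam
# (sum truncated at k = 60; the remaining tail is negligible for lam = 3)
lam = 3.0
mean_pois = sum(k * exp(-lam) * lam**k / factorial(k) for k in range(60))

print(mean_bin, mean_geom, mean_pois)  # close to np = 5, 1/p = 2, lam = 3
```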
Example: X is the number of customers who visit a particular shop in a day.

Continuous-type distributions. Besides the discrete random variables, we also have continuous-type random variables, whose values can fill an entire interval [a, b] or the whole line, e.g. the lifetime of a car. In this case X has a probability density function f_X(x), so that

P(X ≤ z) = ∫_{−∞}^z f_X(x) dx, µ = E[X] = ∫_{−∞}^∞ x f_X(x) dx, V(X) = ∫_{−∞}^∞ (x − µ)^2 f_X(x) dx.

Examples.

1. X ~ U[a, b], the uniform distribution on the interval [a, b]. The probability density function is given by

f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

µ = E(X) = ∫ x f_X(x) dx = (a + b)/2.

2. X ~ Exp(λ), the exponential distribution with parameter λ. The probability density function is given by

f_X(x) = λ exp(−λx) for x ≥ 0, and 0 for x < 0.

µ = E(X) = ∫ x f_X(x) dx = 1/λ.

For example, X could be the lifetime of a computer.

3. X ~ N(µ, σ^2), the normal distribution. The probability density function is given by

f_X(x) = (2πσ^2)^{−1/2} exp(−(x − µ)^2 / (2σ^2)).

In this case E(X) = µ, V(X) = σ^2. For example, X could be the measurement of a particular quantity µ.

1.5. Probability generating functions

We are going to introduce a useful tool for studying random variables and their associated distributions. Let X be a random variable taking non-negative
integer values. Let f_X(n) = P(X = n), n = 0, 1, 2, ..., be the probability mass function. The probability generating function of X is defined by the following infinite sum:

G_X(s) = Σ_{n≥0} f_X(n) s^n = Σ_{n≥0} P(X = n) s^n = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + ... + P(X = n)s^n + ...

Example 2.4 Let X be a Bernoulli random variable, i.e., P(X = 1) = p, P(X = 0) = 1 − p = q. Find the probability generating function of X.

G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + ... = q + ps.

Example 2.5 X ~ Bin(n, p). Find G_X(s).

G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + ... = Σ_{k=0}^n P(X = k) s^k = Σ_{k=0}^n (n choose k) p^k (1 − p)^{n−k} s^k = (ps + (1 − p))^n.

Example 2.6 X ~ Poiss(5). Find G_X(s).

G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + ... = Σ_{n≥0} P(X = n) s^n = Σ_{n≥0} e^{−5} (5^n/n!) s^n = e^{−5} Σ_{n≥0} (5s)^n/n! = e^{−5} e^{5s} = e^{5s−5}.

Example 2.7 Let X be a random variable with probability mass function P(X = 1) = 1/5, P(X = 3) = 1/5 and P(X = 4) = 3/5. Find the probability generating function of X.
G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + ... = P(X = 1)s + P(X = 3)s^3 + P(X = 4)s^4 = (1/5)s + (1/5)s^3 + (3/5)s^4.

Properties.

(1). Equivalent expression for G_X(s): for any fixed s, write h(x) = s^x. Then

G_X(s) = E[s^X] = E[h(X)] = Σ_{n≥0} h(n)P(X = n) = Σ_{n≥0} s^n P(X = n).

(2).

G_X(1) = Σ_{n≥0} P(X = n) 1^n = Σ_{n≥0} P(X = n) = 1, and G_X(0) = P(X = 0).

(3).

G′_X(s) = (Σ_{n≥0} f_X(n) s^n)′ = Σ_{n≥1} f_X(n) n s^{n−1}.

Hence

G′_X(1) = Σ_{n≥1} n f_X(n) = E(X).

e.g. X ~ Bin(n, p): G(s) = (ps + 1 − p)^n and G′(1) = np = E[X].

(4). Similarly, we have

Var(X) = G″_X(1) + G′_X(1) − (G′_X(1))^2.

(5). The generating function determines the probability mass function: if

G_{X_1}(s) = Σ_{n≥0} P(X_1 = n) s^n = Σ_{n≥0} P(X_2 = n) s^n = G_{X_2}(s),

then matching the coefficients we have

P(X_1 = n) = P(X_2 = n), n = 0, 1, 2, ...

(6). If X_1 and X_2 are independent, then G_{X_1+X_2}(s) = G_{X_1}(s)G_{X_2}(s). Let us prove this. Assume X_1 and X_2 are independent. Then, using the alternative expression,

G_{X_1+X_2}(s) = E[s^{X_1+X_2}] = E[s^{X_1} · s^{X_2}] = E[s^{X_1}]E[s^{X_2}] = G_{X_1}(s)G_{X_2}(s).
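These properties can be checked numerically for any concrete probability mass function by differentiating the generating function term by term. A short Python sketch; the pmf below is an arbitrary illustration, not taken from the notes:

```python
# Check pgf properties for an arbitrary illustrative pmf.
pmf = {0: 1/8, 2: 1/4, 3: 3/8, 4: 1/4}   # P(X = n) for n = 0, 2, 3, 4

def G(s):
    """G_X(s) = sum_n P(X = n) s^n."""
    return sum(p * s**n for n, p in pmf.items())

def G1(s):
    """First derivative: G'_X(s) = sum_n n P(X = n) s^(n-1)."""
    return sum(p * n * s**(n - 1) for n, p in pmf.items() if n >= 1)

def G2(s):
    """Second derivative: G''_X(s) = sum_n n(n-1) P(X = n) s^(n-2)."""
    return sum(p * n * (n - 1) * s**(n - 2) for n, p in pmf.items() if n >= 2)

assert G(1.0) == 1.0                       # G_X(1) = 1
mean = G1(1.0)                             # E(X) = G'_X(1)
var = G2(1.0) + G1(1.0) - G1(1.0)**2       # Var(X) from the pgf

# Direct computation from the pmf agrees with the pgf formulas.
m = sum(n * p for n, p in pmf.items())
v = sum((n - m)**2 * p for n, p in pmf.items())
print(mean, var)
```

Both routes give the same mean and variance, illustrating that G′_X(1) and G″_X(1) + G′_X(1) − (G′_X(1))^2 recover E(X) and Var(X).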
Example.8 Suppose that the probability generating function of X is given by G X (s) 1 8 + 1 4 s + 3 8 s3 + 1 4 s4. Write down the probability mass function of X. Since P (X n) is the coefficient to s n by the definition of the generating function, we have P (X 0) 1 8, P (X 1) 0, P (X ) 1 4, P (X 3) 3 8 and P (X 4) 1 4. Example.9 Suppose that the probability generating functions of independent random variables X 1 and X are given by G X1 (s) 1 3 + 3 s, G X (s) 5 s + 3 5 s4. Let S X 1 + X. Find (1). G S (s). (). The probability mass function of S. (1). G S (s) G X1 (s)g X (s) ( 1 3 + 3 s )( 5 s + 3 5 s4 ) 15 s + 1 5 s4 + 4 15 s4 + 5 s6. (). P (S ) 15, P (S 4) 7 15, P (S 6) 5. Example.10 Suppose that the probability generating functions of X is G X (s) s e s Find the probability mass function (distribution) of X. Write G X (s) in power series as follows: We conclude that G X (s) s e s s e e s s e e s (s)n P (X n + ) e n n e sn+, n 0, 1,... In general, if G X (s) e λs λ, then X P oiss(λ). (s) n Example.11 If X 1 P oiss(λ 1 ), X P oiss(λ ) and X 1 X are independent, show X 1 + X P oiss(λ 1 + λ ). 8
We know that G_{X_1}(s) = e^{λ_1 s − λ_1} and G_{X_2}(s) = e^{λ_2 s − λ_2}. Since X_1 and X_2 are independent, we have

G_{X_1+X_2}(s) = G_{X_1}(s)G_{X_2}(s) = e^{(λ_1+λ_2)s − (λ_1+λ_2)}.

Therefore, X_1 + X_2 ~ Poiss(λ_1 + λ_2).

Example 2.12 (1). Let G(s) = a_0 + a_1 s + a_2 s^2 + ... + a_n s^n be a polynomial with a_i ≥ 0, i ≥ 0. Write down the necessary and sufficient condition under which G is the probability generating function of some random variable X.

(2). Suppose that a random variable X has generating function

G_X(s) = ps/(1 − qs), 0 < p < 1, q = 1 − p.

Determine the probability mass function of X and E(X).

(1). The condition is a_0 + a_1 + ... + a_n = 1.

(2). Using 1/(1 − x) = Σ_{n≥0} x^n, we obtain

G_X(s) = ps/(1 − qs) = ps[Σ_{n≥0} (qs)^n] = Σ_{n≥0} p q^n s^{n+1}.

This implies that P(X = n + 1) = p q^n, n ≥ 0. A simple computation gives

G′_X(s) = p/(1 − qs)^2.

Hence E(X) = G′_X(1) = p/(1 − q)^2 = 1/p.

Lemma 2.13 If X has the p.g.f. G_X(s), then the p.g.f. of Y = a + bX is given by G_Y(s) = s^a G_X(s^b).

Proof. G_Y(s) = E[s^Y] = E[s^{a+bX}] = E[s^a s^{bX}] = s^a E[(s^b)^X] = s^a G_X(s^b).

Example 2.14 Assume P(X = n) = (2/3)(1/3)^{n−1}, n ≥ 1. (1). Find G_X(s). (2). Write down the generating function of Y = 2 + 5X.