Random Models

Tusheng Zhang

February 14, 2013

1 Introduction

In this module, we will introduce some random models which have many real-life applications. The course consists of four parts.

1. A brief review of probability theory.
2. Simple random walks.
3. Branching processes.
4. Renewal processes.

Why random? Our lives are affected by random variations, uncertainties and unexpected events. For example, the credit crunch, the banking crisis, the earthquake in Haiti, the temperature, the price of a stock, the price of oil and many other quantities are affected by random, unpredictable factors. Meanwhile, these quantities also evolve in time. It is very important to learn how to handle such random variations; this is the aim of this and many other probability courses. The prerequisites for this module are Probability I and Probability II.

To see what kind of problems we are going to treat, let us look at some examples.

Example 1.1 A gambler has a capital of 10000. He wins or loses 100 with probability $1/3$ or $2/3$, respectively, at each bet. What is the probability (the chance) that the gambler is eventually ruined?

Example 1.2 What is the probability that a drunk man (wandering in the street) eventually gets home safely?

Example 1.3 Family tree. Suppose that the first generation starts with a couple (of rabbits). We can then ask questions such as: what is the probability that the 10th generation consists of 30 members? What is the probability that the family tree eventually becomes extinct?
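Although the tools to answer Example 1.1 exactly come later in the course, the question is easy to explore numerically. Below is a minimal Monte Carlo sketch (not part of the original notes); since the example has no upper target, the walk is stopped at an assumed cap and a fortune reaching the cap is counted as "safe", so the estimate slightly understates the true ruin probability.

```python
import random

def estimate_ruin(capital=10_000, stake=100, p_win=1/3,
                  cap=30_000, trials=2_000):
    """Estimate the ruin probability of Example 1.1 by simulation.

    Each bet wins `stake` with probability `p_win`, else loses `stake`.
    The walk stops at 0 (ruin) or at `cap` (an artificial stopping
    level introduced only to keep the simulation finite).
    """
    ruined = 0
    for _ in range(trials):
        fortune = capital
        while 0 < fortune < cap:
            fortune += stake if random.random() < p_win else -stake
        if fortune == 0:
            ruined += 1
    return ruined / trials

print(estimate_ruin())  # very close to 1: the bets are unfavourable
```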

2 A brief review of probability theory

2.1. Probability spaces

A probability space $(\Omega, \mathcal{F}, P)$ consists of three objects: $\Omega$ is the sample space, made of all possible outcomes; $\mathcal{F}$ is the collection of events; $P(A)$ is the probability that the event $A$ happens.

Computation rules:

1. If $A \subset B$, then $P(A) \le P(B)$.

2. Let $A^c$ denote the complement of $A$. Then $P(A^c) = 1 - P(A)$.

3. $P(A \cup B) = P(A) + P(B) - P(A \cap B)$, where $A \cap B$ is the intersection of $A$ and $B$ (both $A$ and $B$).

2.2. Conditional probability

The probability of $A$ given that the event $B$ has occurred is called the conditional probability of $A$ given $B$, defined by
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

Consequences.

1. From the definition,
$$P(A \cap B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A).$$

2. The law of total probability. Let $\{B_i\}_{i \ge 1}$ be a partition of the sample space $\Omega$, i.e., the $B_i$ are disjoint and $\Omega = \bigcup_{i=1}^{\infty} B_i$. Then for any event $A$, since $A = \bigcup_{i=1}^{\infty} (A \cap B_i)$, it holds that
$$P(A) = \sum_{i=1}^{\infty} P(A \cap B_i) = \sum_{i=1}^{\infty} P(B_i)\,P(A \mid B_i).$$

Example 2.1 Suppose that the probability that a man is color-blind is 0.005 and the probability that a woman is color-blind is 0.0025. Assume that the population is evenly divided between men and women. (1) Find the probability that a randomly chosen person from the population is color-blind. (2) Given that a person from the population is color-blind, what is the probability that the person is a man?

Let
$$C = \{\text{the person is color-blind}\}, \quad M = \{\text{the person is a man}\}, \quad W = \{\text{the person is a woman}\}.$$
Then
$$P(M) = \tfrac{1}{2}, \quad P(W) = \tfrac{1}{2}, \quad P(C \mid M) = 0.005, \quad P(C \mid W) = 0.0025.$$

(1). Since $C = (C \cap M) \cup (C \cap W)$,
$$P(C) = P(M)P(C \mid M) + P(W)P(C \mid W) = \tfrac{1}{2} \times 0.005 + \tfrac{1}{2} \times 0.0025 \approx 0.0038.$$

(2).
$$P(M \mid C) = \frac{P(M \cap C)}{P(C)} = \frac{P(M)P(C \mid M)}{P(C)} = \frac{\tfrac{1}{2} \times 0.005}{0.0038} \approx 0.66.$$
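The two steps of Example 2.1 are exactly the law of total probability followed by Bayes' rule; here is a quick numeric check (a sketch, not part of the notes):

```python
# Numeric check of Example 2.1.
p_m, p_w = 0.5, 0.5           # population evenly divided
p_c_m, p_c_w = 0.005, 0.0025  # color-blindness rates from the example

p_c = p_m * p_c_m + p_w * p_c_w   # law of total probability: 0.00375 ~ 0.0038
p_m_c = p_m * p_c_m / p_c         # Bayes' rule: 2/3 ~ 0.67

print(p_c, p_m_c)  # the notes round P(C) to 0.0038 first, which gives 0.66
```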

2.3. Independence

If $P(A \mid B) = P(A)$, meaning that the probability of $A$ remains the same no matter whether $B$ has occurred or not (i.e., $A$ has nothing to do with $B$), we say that $A$ and $B$ are independent. In this case,
$$P(A \cap B) = P(A)P(B).$$
Events $A_1, A_2, \ldots$ are said to be mutually independent if for each integer $k$ and each choice of distinct integers $n_1, \ldots, n_k$,
$$P\left(\bigcap_{i=1}^{k} A_{n_i}\right) = \prod_{i=1}^{k} P(A_{n_i}).$$

Example 2.2 Toss a coin twice. Let $A = \{\text{head in the first toss}\}$ and $B = \{\text{tail in the second toss}\}$. Then $A$, $B$ are independent.

2.4. Random variables

A random variable $X$ is a real-valued function of outcomes; the value of $X$ depends on the outcome of the experiment.

Example 2.3 Let $X$ be the number of black balls one gets when choosing 4 balls from a box that contains 6 black balls and 4 white balls. Then $X$ is a random variable.

Example: $X$ is the number of points one gets when tossing a die.

Discrete random variables. If a random variable $X$ can only assume countably many values $\{x_1, x_2, \ldots, x_n, \ldots\}$, then $X$ is called a discrete random variable. In this case,
$$\{f_X(x_r) = P(X = x_r),\ r = 1, 2, \ldots\}$$
is called the probability mass (distribution) function of $X$.

Expectation (mean). If $X$ is a discrete random variable taking values $x_1, x_2, \ldots, x_n, \ldots$, the expectation or mean of $X$ is defined as
$$\mu = E(X) = \sum_{r=1}^{\infty} x_r P(X = x_r).$$
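As a small illustration of this definition (a sketch, not from the notes), the mean number of points for the toss of a fair die:

```python
from fractions import Fraction

# pmf of a fair die: P(X = r) = 1/6 for r = 1, ..., 6
pmf = {r: Fraction(1, 6) for r in range(1, 7)}

# E(X) = sum of x_r * P(X = x_r)
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 7/2, i.e. 3.5
```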

Variance. The variance of $X$ is defined as
$$V(X) = E[(X - \mu)^2] = \sum_{r=1}^{\infty} (x_r - \mu)^2 P(X = x_r),$$
which measures the deviation of $X$ from its mean.

Independence of random variables. Let $X$ be a random variable taking values $x_1, \ldots, x_r, \ldots$ and $Y$ a random variable taking values $y_1, \ldots, y_r, \ldots$. We say that $X$, $Y$ are independent if for all $x_i$, $y_j$,
$$P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j).$$
If $X$, $Y$ are independent, we can deduce that $E(XY) = E(X)E(Y)$.

Examples of discrete distributions

1. Bernoulli distribution. Write $X \sim \mathrm{Bern}(p)$. In this case, the possible values of $X$ are 0, 1 and
$$P(X = 1) = p, \quad P(X = 0) = q = 1 - p.$$
For example, if $X$ is the number of heads we get when tossing a fair coin once, then $X \sim \mathrm{Bern}(\tfrac{1}{2})$.

2. Binomial distribution with parameters $n$ and $p$. Write $X \sim \mathrm{Bin}(n, p)$. $X$ is used to denote the number of times a certain event occurs in a series of $n$ repeated trials. It holds that
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n - k}, \quad k = 0, \ldots, n, \qquad E[X] = np.$$
Example: $X$ = the number of heads we get when tossing a fair coin $n$ times; $X \sim \mathrm{Bin}(n, \tfrac{1}{2})$.

3. Geometric distribution. We say that $X$ has a geometric distribution with parameter $p$, and write $X \sim G(p)$, if
$$P(X = n) = p(1 - p)^{n - 1}, \quad n = 1, 2, 3, \ldots$$
In this case, $E[X] = 1/p$. For example, the number $X$ of tosses we need to get the first head has a geometric distribution with parameter $\tfrac{1}{2}$.

4. Poisson distribution. We say $X$ has a Poisson distribution with parameter $\lambda$, and write $X \sim \mathrm{Poiss}(\lambda)$, if
$$P(X = n) = e^{-\lambda} \frac{\lambda^n}{n!}, \quad n = 0, 1, \ldots, \qquad E(X) = V(X) = \lambda.$$
Example: $X$ is the number of customers who visit a particular shop in a day.
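The stated means (and the Poisson variance) can be checked directly from the definitions by summing the mass functions; a sketch, with the infinite geometric and Poisson sums truncated at arbitrary points:

```python
from math import comb, exp, factorial

def mean_var(pmf):
    """Mean and variance of a pmf given as {value: probability}."""
    mu = sum(x * p for x, p in pmf.items())
    return mu, sum((x - mu) ** 2 * p for x, p in pmf.items())

n, p, lam = 10, 0.3, 4.0
binom = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
geom = {k: p * (1 - p)**(k - 1) for k in range(1, 200)}              # truncated
poisson = {k: exp(-lam) * lam**k / factorial(k) for k in range(60)}  # truncated

print(mean_var(binom)[0])   # ~ np = 3.0
print(mean_var(geom)[0])    # ~ 1/p = 3.33...
print(mean_var(poisson))    # ~ (lambda, lambda) = (4.0, 4.0)
```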

Continuous type distributions

Besides discrete random variables, we also have continuous type random variables, whose values can fill an entire interval $[a, b]$ or the whole line, e.g. the lifetime of a car. In this case, $X$ has a probability density function $f_X(x)$ so that
$$P(X \le z) = \int_{-\infty}^{z} f_X(x)\,dx, \qquad \mu = E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx, \qquad V(X) = \int_{-\infty}^{\infty} (x - \mu)^2 f_X(x)\,dx.$$

Examples.

1. $X \sim U[a, b]$, the uniform distribution on the interval $[a, b]$. The probability density function is given by
$$f_X(x) = \begin{cases} \frac{1}{b - a} & a \le x \le b \\ 0 & \text{otherwise.} \end{cases}$$
$$\mu = E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx = \frac{1}{2}(b + a).$$

2. $X \sim \mathrm{Exp}(\lambda)$, the exponential distribution with parameter $\lambda$. The probability density function is given by
$$f_X(x) = \begin{cases} \lambda \exp(-\lambda x) & x \ge 0 \\ 0 & x < 0. \end{cases}$$
$$\mu = E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx = \frac{1}{\lambda}.$$
For example, $X$ could be the lifetime of a computer.

3. $X \sim N(\mu, \sigma^2)$, the normal distribution. The probability density function is given by
$$f_X(x) = (2\pi\sigma^2)^{-\frac{1}{2}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right).$$
In this case, $E(X) = \mu$, $V(X) = \sigma^2$. For example, $X$ could be the measurement of a particular quantity $\mu$.
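A simulation sketch comparing empirical means with the stated formulas (the parameter values and sample size are arbitrary choices, not from the notes):

```python
import random

random.seed(1)
n, a, b, lam = 100_000, 2.0, 6.0, 0.5

uniform_mean = sum(random.uniform(a, b) for _ in range(n)) / n
expo_mean = sum(random.expovariate(lam) for _ in range(n)) / n

print(uniform_mean)  # close to (a + b)/2 = 4.0
print(expo_mean)     # close to 1/lambda = 2.0
```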

2.5. Probability generating functions

We are going to introduce a useful tool for studying random variables and their associated distributions. Let $X$ be a random variable taking non-negative integer values, and let $f_X(n) = P(X = n)$, $n = 0, 1, 2, \ldots$ be its probability mass function. The probability generating function of $X$ is defined by the following infinite sum:
$$G_X(s) = \sum_{n=0}^{\infty} f_X(n) s^n = \sum_{n=0}^{\infty} P(X = n) s^n = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + \cdots + P(X = n)s^n + \cdots$$

Example 2.4 Let $X$ be a Bernoulli random variable, i.e., $P(X = 1) = p$, $P(X = 0) = 1 - p = q$. Find the probability generating function of $X$.
$$G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + \cdots = q + ps.$$

Example 2.5 $X \sim \mathrm{Bin}(n, p)$. Find $G_X(s)$.
$$G_X(s) = \sum_{k=0}^{n} P(X = k)s^k = \sum_{k=0}^{n} \binom{n}{k} p^k (1 - p)^{n - k} s^k = (ps + (1 - p))^n.$$

Example 2.6 $X \sim \mathrm{Poiss}(5)$. Find $G_X(s)$.
$$G_X(s) = \sum_{n=0}^{\infty} P(X = n)s^n = \sum_{n=0}^{\infty} e^{-5} \frac{5^n}{n!} s^n = e^{-5} \sum_{n=0}^{\infty} \frac{(5s)^n}{n!} = e^{-5} e^{5s} = e^{5s - 5}.$$
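The closed forms in Examples 2.5 and 2.6 can be verified by summing the defining series directly; a numerical sketch (the evaluation point $s$ and the truncation of the Poisson series are arbitrary):

```python
from math import comb, exp, factorial

def pgf(pmf, s):
    """Evaluate G_X(s) = sum_n P(X = n) s^n directly from the pmf."""
    return sum(p * s**n for n, p in pmf.items())

s, n, p, lam = 0.7, 6, 0.4, 5.0
binom = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
poisson = {k: exp(-lam) * lam**k / factorial(k) for k in range(100)}  # truncated

print(pgf(binom, s), (p * s + 1 - p)**n)  # both ~ 0.88**6
print(pgf(poisson, s), exp(5 * s - 5))    # both ~ e**(-1.5)
```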

Example 2.7 Let $X$ be a random variable with probability mass function $P(X = 1) = \tfrac{1}{5}$, $P(X = 3) = \tfrac{1}{5}$ and $P(X = 4) = \tfrac{3}{5}$. Find the probability generating function of $X$.
$$G_X(s) = P(X = 0) + P(X = 1)s + P(X = 2)s^2 + \cdots = P(X = 1)s + P(X = 3)s^3 + P(X = 4)s^4 = \frac{1}{5}s + \frac{1}{5}s^3 + \frac{3}{5}s^4.$$

Properties.

(1). Equivalent expression for $G_X(s)$: for any fixed $s$, write $h(x) = s^x$. Then
$$G_X(s) = E[s^X] = E[h(X)] = \sum_{n} h(n) P(X = n) = \sum_{n} s^n P(X = n).$$

(2).
$$G_X(1) = \sum_{n} P(X = n) 1^n = \sum_{n} P(X = n) = 1, \qquad G_X(0) = P(X = 0).$$

(3).
$$G_X'(s) = \left(\sum_{n} f_X(n) s^n\right)' = \sum_{n} f_X(n) n s^{n-1},$$
hence
$$G_X'(1) = \sum_{n} n f_X(n) = E(X).$$
E.g. $X \sim \mathrm{Bin}(n, p)$: $G(s) = (ps + 1 - p)^n$ and $G'(1) = np = E[X]$.

(4). Similarly, we have
$$\mathrm{Var}(X) = G_X''(1) + G_X'(1) - (G_X'(1))^2.$$

(5). The generating function determines the probability mass function: if
$$G_{X_1}(s) = \sum_{n} P(X_1 = n)s^n = G_{X_2}(s) = \sum_{n} P(X_2 = n)s^n,$$
then matching the coefficients we have
$$P(X_1 = n) = P(X_2 = n), \quad n = 0, 1, 2, \ldots$$

(6). If $X_1$ and $X_2$ are independent, then $G_{X_1 + X_2}(s) = G_{X_1}(s) G_{X_2}(s)$. Let us prove it. Assume $X_1$ and $X_2$ are independent. Then, using the alternative expression,
$$G_{X_1 + X_2}(s) = E[s^{X_1 + X_2}] = E[s^{X_1} \cdot s^{X_2}] = E[s^{X_1}]E[s^{X_2}] = G_{X_1}(s) G_{X_2}(s).$$
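Properties (3) and (4) in action, for the binomial case mentioned under (3). This sketch uses sympy (an extra dependency, not something the notes assume) to differentiate the generating function symbolically:

```python
import sympy as sp

s, n, p = sp.symbols('s n p', positive=True)

G = (p * s + 1 - p) ** n          # PGF of Bin(n, p), from Example 2.5

mean = sp.diff(G, s).subs(s, 1)   # property (3): G'(1) = E(X)
var = (sp.diff(G, s, 2) + sp.diff(G, s) - sp.diff(G, s) ** 2).subs(s, 1)

print(sp.simplify(mean))          # n*p
print(sp.simplify(var))           # n*p*(1 - p), possibly printed as -n*p*(p - 1)
```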

Example 2.8 Suppose that the probability generating function of $X$ is given by
$$G_X(s) = \frac{1}{8} + \frac{1}{4}s^2 + \frac{3}{8}s^3 + \frac{1}{4}s^4.$$
Write down the probability mass function of $X$. Since $P(X = n)$ is the coefficient of $s^n$ in the generating function, we have
$$P(X = 0) = \frac{1}{8}, \quad P(X = 1) = 0, \quad P(X = 2) = \frac{1}{4}, \quad P(X = 3) = \frac{3}{8}, \quad P(X = 4) = \frac{1}{4}.$$

Example 2.9 Suppose that the probability generating functions of independent random variables $X_1$ and $X_2$ are given by
$$G_{X_1}(s) = \frac{1}{3} + \frac{2}{3}s^2, \qquad G_{X_2}(s) = \frac{2}{5}s^2 + \frac{3}{5}s^4.$$
Let $S = X_1 + X_2$. Find (1) $G_S(s)$; (2) the probability mass function of $S$.

(1).
$$G_S(s) = G_{X_1}(s) G_{X_2}(s) = \left(\frac{1}{3} + \frac{2}{3}s^2\right)\left(\frac{2}{5}s^2 + \frac{3}{5}s^4\right) = \frac{2}{15}s^2 + \frac{1}{5}s^4 + \frac{4}{15}s^4 + \frac{2}{5}s^6.$$

(2). $P(S = 2) = \frac{2}{15}$, $P(S = 4) = \frac{7}{15}$, $P(S = 6) = \frac{2}{5}$.

Example 2.10 Suppose that the probability generating function of $X$ is
$$G_X(s) = s^2 e^{2s - 2}.$$
Find the probability mass function (distribution) of $X$. Write $G_X(s)$ as a power series as follows:
$$G_X(s) = s^2 e^{-2} e^{2s} = s^2 e^{-2} \sum_{n=0}^{\infty} \frac{(2s)^n}{n!} = \sum_{n=0}^{\infty} e^{-2} \frac{2^n}{n!} s^{n+2}.$$
We conclude that
$$P(X = n + 2) = e^{-2} \frac{2^n}{n!}, \quad n = 0, 1, \ldots$$
In general, if $G_X(s) = e^{\lambda s - \lambda}$, then $X \sim \mathrm{Poiss}(\lambda)$.
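Multiplying generating functions, as in Example 2.9, is exactly the bookkeeping for convolving the two mass functions; a sketch checking the answer directly:

```python
from fractions import Fraction as F
from itertools import product

def pmf_of_sum(pmf1, pmf2):
    """pmf of X1 + X2 for independent X1, X2, by direct convolution."""
    out = {}
    for (x, p), (y, q) in product(pmf1.items(), pmf2.items()):
        out[x + y] = out.get(x + y, 0) + p * q
    return out

pmf1 = {0: F(1, 3), 2: F(2, 3)}  # G_{X1}(s) = 1/3 + (2/3)s^2
pmf2 = {2: F(2, 5), 4: F(3, 5)}  # G_{X2}(s) = (2/5)s^2 + (3/5)s^4

print(pmf_of_sum(pmf1, pmf2))    # {2: 2/15, 4: 7/15, 6: 2/5}, as in Example 2.9
```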

Example 2.11 If $X_1 \sim \mathrm{Poiss}(\lambda_1)$, $X_2 \sim \mathrm{Poiss}(\lambda_2)$ and $X_1$, $X_2$ are independent, show that $X_1 + X_2 \sim \mathrm{Poiss}(\lambda_1 + \lambda_2)$.

We know that $G_{X_1}(s) = e^{\lambda_1 s - \lambda_1}$ and $G_{X_2}(s) = e^{\lambda_2 s - \lambda_2}$. Since $X_1$ and $X_2$ are independent, we have
$$G_{X_1 + X_2}(s) = G_{X_1}(s) G_{X_2}(s) = e^{(\lambda_1 + \lambda_2)s - (\lambda_1 + \lambda_2)}.$$
Therefore, $X_1 + X_2 \sim \mathrm{Poiss}(\lambda_1 + \lambda_2)$.

Example 2.12 (1). Let $G(s) = a_0 + a_1 s + a_2 s^2 + \cdots + a_n s^n$ be a polynomial with $a_i \ge 0$, $i \ge 0$. Write down the necessary and sufficient condition under which $G$ is the probability generating function of some random variable $X$. (2). Suppose that a random variable $X$ has generating function
$$G_X(s) = \frac{ps}{1 - qs}, \quad 0 < p < 1, \quad q = 1 - p.$$
Determine the probability mass function of $X$ and $E(X)$.

(1). The condition is $a_0 + a_1 + \cdots + a_n = 1$.

(2). Using $\frac{1}{1 - x} = \sum_{n=0}^{\infty} x^n$, we obtain that
$$G_X(s) = \frac{ps}{1 - qs} = ps \left[\sum_{n=0}^{\infty} (qs)^n\right] = \sum_{n=0}^{\infty} p q^n s^{n+1}.$$
This implies that $P(X = n + 1) = pq^n$, $n \ge 0$. A simple computation gives
$$G_X'(s) = \frac{p}{(1 - qs)^2}.$$
Hence,
$$E(X) = G_X'(1) = \frac{p}{(1 - q)^2} = \frac{1}{p}.$$

Lemma 2.13 If $X$ has the p.g.f. $G_X(s)$, then the p.g.f. of $Y = a + bX$ is given by $G_Y(s) = s^a G_X(s^b)$.

Proof.
$$G_Y(s) = E[s^Y] = E[s^{a + bX}] = E[s^a s^{bX}] = s^a E[(s^b)^X] = s^a G_X(s^b).$$

Example 2.14 Assume $P(X = n) = \frac{2}{3}\left(\frac{1}{3}\right)^{n-1}$, $n \ge 1$. (1). Find $G_X(s)$. (2). Write down the generating function of $Y = 2 + 5X$.
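A numerical sanity check of Lemma 2.13 in the setting of Example 2.14 (a sketch; the evaluation point $s = 0.9$ and the truncation level are arbitrary):

```python
# X is geometric with p = 2/3, as in Example 2.14, and Y = 2 + 5X.
# Lemma 2.13 says G_Y(s) = s^2 * G_X(s^5); compare both sides numerically.
p, q, s, N = 2/3, 1/3, 0.9, 400   # N truncates the infinite sums

G_X = lambda t: sum(p * q**(n - 1) * t**n for n in range(1, N))
G_Y = sum(p * q**(n - 1) * s**(2 + 5*n) for n in range(1, N))  # E[s^Y] directly

print(G_Y, s**2 * G_X(s**5))  # the two values agree up to truncation error
```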