Lecture 4: Bernoulli Process

Hyang-Won Lee
Dept. of Internet & Multimedia Eng., Konkuk University

Stochastic Process

A stochastic process is a mathematical model of a probabilistic experiment that evolves in time and generates a sequence of numerical values. In other words, a stochastic process is a sequence of random variables defined on the same probabilistic model $(\Omega, P)$. The following are examples that can be modelled using a stochastic process:

- The sequence of hourly customer arrivals at a bank
- The sequence of hourly traffic loads at a link of a communication network
- The sequence of hourly locations of a mobile phone user
- The sequence of the numbers of phone calls to a customer service center
- The sequence of bus arrival times at a station

We are often interested in analyzing the behavior of a stochastic process. For example, for a stochastic process modeling hourly customer arrivals at a bank: What is the probability that a certain number of customers come to the bank within a given time duration? What is the long-term average of customer arrivals? These questions can be answered by analyzing the properties of the stochastic process.

Bernoulli Process

Our first stochastic process is the Bernoulli process, which is one of the simplest stochastic processes but a good starting point for understanding more complex ones.

Bernoulli Process: The Bernoulli process is a sequence $X_1, X_2, \ldots$ such that the $X_i$'s are independent Bernoulli random variables with $P(X_i = 1) = p$ and $P(X_i = 0) = 1 - p$ for each $i$.

The independence assumption in the Bernoulli process has important consequences. For instance, the two segments $\{X_1, X_2, \ldots, X_{10}\}$ and $\{X_{11}, X_{12}, \ldots, X_{20}\}$ of the Bernoulli process are independent of each other. Further, suppose that the Bernoulli process has been running for $n$ time steps. Then the sequence of future trials $X_{n+1}, X_{n+2}, \ldots$ is independent of the past trials $X_1, \ldots, X_n$.
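For concreteness, here is a minimal Python sketch (not from the original slides) that simulates the first $n$ trials of a Bernoulli process; the function name and parameter values are illustrative choices:

```python
import random

def bernoulli_process(p, n, seed=None):
    """Simulate the first n trials X_1, ..., X_n of a Bernoulli(p) process."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

# Example: 20 trials with success probability p = 0.3
print(bernoulli_process(0.3, 20, seed=1))
```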

Memorylessness

Another consequence of the independence assumption is that the Bernoulli process is memoryless in the following sense. Let $T$ be the time until the first success. We already know that $T$ is a geometric random variable with $P(T = k) = (1-p)^{k-1} p$. Suppose now that we have been watching the process for $n$ time steps but there has been no success, i.e., there is no time index $i \le n$ such that $X_i = 1$. Then, what can we say about the remaining time until the first success, i.e., the random variable $T - n$? Informally, since the future trials are independent of the past, the fact that there has been no success up to time $n$ should give no information about the value of $T - n$. Mathematically, for $k \ge 1$,

$$P(T - n = k \mid T > n) = \frac{P(T - n = k,\ T > n)}{P(T > n)} = \frac{(1-p)^{n+k-1} p}{(1-p)^n} = (1-p)^{k-1} p = P(T = k).$$

So $T$ behaves as if nothing had happened in the past. In some cases, using this intuition greatly simplifies calculations compared to relying on the algebra of PMFs, PDFs, and CDFs.
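The memoryless property can also be checked numerically. The following sketch (my own illustration, with arbitrary parameter choices) estimates both sides of the identity above by Monte Carlo simulation:

```python
import random

def estimate_memorylessness(p, n, k, trials=200_000, seed=1):
    """Estimate P(T - n = k | T > n) and P(T = k) by simulation, where T is
    the time of the first success in a Bernoulli(p) process."""
    rng = random.Random(seed)

    def first_success_time():
        t = 1
        while rng.random() >= p:  # keep drawing slots until a success
            t += 1
        return t

    samples = [first_success_time() for _ in range(trials)]
    beyond_n = [t for t in samples if t > n]
    cond = sum(1 for t in beyond_n if t - n == k) / len(beyond_n)
    uncond = sum(1 for t in samples if t == k) / trials
    return cond, uncond

cond, uncond = estimate_memorylessness(p=0.3, n=5, k=2)
print(cond, uncond)  # both close to (1 - 0.3)**1 * 0.3 = 0.21
```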

Example

A computer executes two types of tasks, priority and nonpriority, and operates in discrete time units (slots). A priority task arrives at the beginning of a time slot with probability $p$, and arrivals are independent. On the other hand, assume that there are always nonpriority tasks to be executed, and the computer executes nonpriority tasks only when there is no priority task to be processed. A time slot is said to be busy if there is a priority task arrival, and idle otherwise. We are interested in deriving the PMFs of the following random variables:

- $T$: the time index of the first idle slot
- $B$: the length (number of slots) of the first busy period
- $I$: the length of the first idle period
- $Z$: the number of slots after the first slot of the first busy period up to and including the first subsequent idle slot

It is easy to see that

$$P(T = k) = p^{k-1}(1-p), \quad k = 1, 2, \ldots$$
$$P(B = k) = p^{k-1}(1-p), \quad k = 1, 2, \ldots$$
$$P(I = k) = (1-p)^{k-1} p, \quad k = 1, 2, \ldots$$
$$P(Z = k) = p^{k-1}(1-p), \quad k = 1, 2, \ldots$$
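These formulas are easy to verify by simulation. As a rough illustration (the parameter values are my own), the following sketch estimates the PMF of the first busy period length $B$ and compares it with $p^{k-1}(1-p)$:

```python
import random
from collections import Counter

def first_busy_period_length(p, rng):
    """Simulate slots until the first busy period ends and return its length B."""
    while rng.random() >= p:   # skip initial idle slots (no priority arrival)
        pass
    length = 1                 # the first busy slot
    while rng.random() < p:    # count subsequent consecutive busy slots
        length += 1
    return length

rng = random.Random(0)
p, trials = 0.6, 100_000
counts = Counter(first_busy_period_length(p, rng) for _ in range(trials))
for k in range(1, 5):
    print(k, counts[k] / trials, p**(k - 1) * (1 - p))
```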

Interarrival Times

A success in the Bernoulli process is often referred to as an arrival (of a job, a packet, etc.). Let $T_k$ be the $k$-th interarrival time, defined as the number of time slots between the $(k-1)$-th arrival and the $k$-th arrival, and let $Y_k$ be the time of the $k$-th arrival. It follows that

$$T_1 = Y_1, \qquad T_k = Y_k - Y_{k-1}, \quad k = 2, 3, \ldots, \qquad Y_k = T_1 + T_2 + \cdots + T_k.$$

By independence, $T_1$ is independent of the future interarrival times $T_2, T_3, \ldots$ Furthermore, $T_1$ is geometric with parameter $p$. Similarly, for each $i > 1$, $T_i$ is geometric and independent of the other interarrival times. That is,

$$P(T_i = k) = (1-p)^{k-1} p, \quad k = 1, 2, \ldots, \quad i = 1, 2, \ldots$$
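The geometric distribution of the interarrival times can be checked by extracting them from a simulated sample path. A short sketch (my own illustration):

```python
import random
from collections import Counter

rng = random.Random(3)
p, n = 0.25, 400_000
x = [rng.random() < p for _ in range(n)]

# Interarrival times: gaps between consecutive arrival (success) times.
arrivals = [i + 1 for i, xi in enumerate(x) if xi]
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]

counts = Counter(gaps)
for k in range(1, 5):
    print(k, counts[k] / len(gaps), (1 - p)**(k - 1) * p)
```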

Alternative Description of Bernoulli Process

The Bernoulli process can be equivalently defined using the interarrival times as follows:

Alternative Description of Bernoulli Process
1. Start with a sequence of independent geometric random variables $T_1, T_2, \ldots$ with common parameter $p$, and let these stand for the interarrival times.
2. Record a success (or arrival) at times $T_1$, $T_1 + T_2$, $T_1 + T_2 + T_3$, ...

Example: It has been observed that after a rainy day, the number of days until it rains again is geometrically distributed with parameter $p$, independently of the past. What is the probability that it rains on the 4th, 5th, and 9th days of the month? Note that the sequence of days, with a rainy day counted as a success, is a Bernoulli process with parameter $p$. Consequently, it rains on the 4th, 5th, and 9th days with probability $p^3$.
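The alternative description translates directly into a simulation: draw geometric interarrival times and accumulate them into arrival times. A minimal sketch (the function names are my own):

```python
import random

def geometric(p, rng):
    """Sample a geometric random variable with parameter p (support 1, 2, ...)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def arrival_times(p, horizon, rng):
    """Build arrival times Y_k = T_1 + ... + T_k up to the given time horizon."""
    times, t = [], 0
    while True:
        t += geometric(p, rng)   # next interarrival time
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
print(arrival_times(0.3, horizon=30, rng=rng))
```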

Properties of k-th Arrival Time

The properties of $Y_k$ can be summarized as follows:

Properties of $Y_k$
1. $Y_k = T_1 + \cdots + T_k$, where the $T_i$'s are independent geometric random variables with parameter $p$.
2. $E[Y_k] = \frac{k}{p}$ and $\mathrm{var}(Y_k) = \frac{k(1-p)}{p^2}$.
3. $P(Y_k = t) = \binom{t-1}{k-1} p^k (1-p)^{t-k}, \quad t = k, k+1, \ldots$

The PMF of $Y_k$ can be derived using the fact that the event $\{Y_k = t\}$ occurs when 1) there are exactly $k-1$ successes in the first $t-1$ trials and 2) the $t$-th trial is a success.
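This PMF (the Pascal, or negative binomial, PMF of order $k$) is easy to compute directly. A small sanity check (my own illustration) that it sums to 1 and has mean $k/p$:

```python
from math import comb

def pascal_pmf(k, t, p):
    """P(Y_k = t): probability that the k-th arrival occurs at time t."""
    if t < k:
        return 0.0
    return comb(t - 1, k - 1) * p**k * (1 - p)**(t - k)

k, p = 3, 0.4
support = range(k, 300)  # truncated support; the tail beyond 300 is negligible
total = sum(pascal_pmf(k, t, p) for t in support)
mean = sum(t * pascal_pmf(k, t, p) for t in support)
print(total, mean, k / p)  # ~1.0, ~7.5, 7.5
```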

Example

In each minute of basketball play, Jordan commits a foul with probability $p$ and no foul with probability $1-p$. The numbers of fouls in different minutes are independent. Jordan will foul out of the game once he commits his sixth foul, and will play for 30 minutes if he does not foul out. What is the PMF of Jordan's playing time?

Let $Y_k$ be the minute in which he commits his $k$-th foul, and let $Z$ be his playing time. Then we have $Z = \min(Y_6, 30)$, and thus

$$P(Z = z) = \begin{cases} \binom{z-1}{5} p^6 (1-p)^{z-6}, & \text{if } z = 6, 7, \ldots, 29, \\ 1 - \sum_{t=6}^{29} \binom{t-1}{5} p^6 (1-p)^{t-6}, & \text{if } z = 30. \end{cases}$$
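Using the Pascal PMF from the previous slide, the PMF of $Z$ can be tabulated in a few lines (a sketch with an arbitrary choice of $p$):

```python
from math import comb

def playing_time_pmf(p):
    """PMF of Z = min(Y_6, 30), Jordan's playing time in minutes."""
    pmf = {z: comb(z - 1, 5) * p**6 * (1 - p)**(z - 6) for z in range(6, 30)}
    pmf[30] = 1 - sum(pmf.values())  # he plays the full 30 minutes
    return pmf

pmf = playing_time_pmf(p=0.2)
print(sum(pmf.values()))  # ~1.0
print(pmf[30])            # probability he does not foul out before minute 30
```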

Splitting and Merging of Bernoulli Processes

Consider splitting a Bernoulli process with parameter $p$: whenever there is an arrival, keep it with probability $q$ and discard it with probability $1-q$, where the keep/discard decisions are independent of each other. Clearly, the process of kept arrivals satisfies the assumptions of the Bernoulli process: at every time slot, there is an arrival with probability $pq$, and the arrivals are independent. For the same reason, the process of discarded arrivals is also a Bernoulli process, with parameter $p(1-q)$.

Consider merging two independent Bernoulli processes, one with parameter $p$ and one with parameter $q$: when there is an arrival in either of the two processes, record an arrival; when there is no arrival in either of the two processes, record no arrival. It is easy to check that the resulting process is a Bernoulli process with parameter $1 - (1-p)(1-q)$.
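Both facts can be confirmed empirically. A minimal sketch (parameters chosen arbitrarily) that splits one process and merges two, then compares the empirical arrival rates with $pq$ and $1-(1-p)(1-q)$:

```python
import random

rng = random.Random(7)
p, q, n = 0.5, 0.3, 200_000

# Splitting: keep each arrival of a Bernoulli(p) process with probability q.
x = [rng.random() < p for _ in range(n)]
kept = [xi and (rng.random() < q) for xi in x]

# Merging: an arrival in either of two independent processes counts as an arrival.
y = [rng.random() < q for _ in range(n)]
merged = [xi or yi for xi, yi in zip(x, y)]

print(sum(kept) / n, p * q)                      # ~0.15
print(sum(merged) / n, 1 - (1 - p) * (1 - q))    # ~0.65
```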

Poisson Approximation to Binomial

The number of successes in $n$ Bernoulli trials with success probability $p$ is binomial with parameters $(n, p)$. Under some conditions, the binomial PMF converges to a Poisson PMF.

Poisson Approximation to Binomial: For any fixed nonnegative integer $k$,

$$p_S(k) = \frac{n!}{(n-k)!\,k!}\, p^k (1-p)^{n-k} \;\longrightarrow\; p_Z(k) = e^{-\lambda} \frac{\lambda^k}{k!},$$

as $n \to \infty$ with $p = \lambda/n$, keeping $\lambda$ constant.

The Poisson approximation can be verified by rearranging $p_S(k)$ as

$$p_S(k) = \frac{n}{n} \cdot \frac{n-1}{n} \cdots \frac{n-k+1}{n} \cdot \frac{\lambda^k}{k!} \left(1 - \frac{\lambda}{n}\right)^{n-k},$$

where, as $n \to \infty$, each of the first $k$ factors tends to 1 and $(1 - \lambda/n)^{n-k} \to e^{-\lambda}$.
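The two PMFs are easy to compare numerically; the following sketch (my own) reproduces the kind of comparison shown in the table on the next slide:

```python
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# n = 100, p = 0.01, so lambda = 1 (as in the first table below)
n, p = 100, 0.01
for k in (0, 2, 5, 10):
    print(k, binomial_pmf(k, n, p), poisson_pmf(k, n * p))
```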

Accuracy of Poisson Approximation

The following table compares the binomial PMF $p_X(k)$ and the Poisson PMF $p_Z(k)$ for $n = 100$ and $p = 0.01$ ($\lambda = 1$):

k        0       2       5         10
p_X(k)   0.366   0.185   0.00290   7.006 × 10^{-8}
p_Z(k)   0.368   0.184   0.00306   1.014 × 10^{-7}

For $n = 5$ and $p = 0.1$ ($\lambda = 0.5$), the approximation is noticeably less accurate:

k        0       1       2        5
p_X(k)   0.590   0.328   0.0729   0.00001
p_Z(k)   0.607   0.303   0.0758   0.00016