Basics of Stochastic Modeling: Part II


Basics of Stochastic Modeling: Part II
Continuous Random Variables
Sandip Chakraborty, Department of Computer Science and Engineering, Indian Institute of Technology Kharagpur
CS 60019, August 10, 2016
Reference book: Probability & Statistics with Reliability, Queuing and Computer Science Applications, by Kishor S. Trivedi

Random Variables in Continuous Space

When the sample space S is non-denumerable, not every subset of the sample space is an event that can be assigned a probability. If X is to be a random variable, we require that P(X ≤ x) be well defined for every real number x. Let F denote the class of measurable subsets of S.

A (continuous) random variable X on a probability space (S, F, P) is a function X : S → R that assigns a real number X(s) to each sample point s ∈ S, such that for every real number x, the set {s | X(s) ≤ x} is a member of F.

The cumulative distribution function F_X of a random variable X is defined to be the function F_X(x) = P(X ≤ x) for −∞ < x < ∞.

Cumulative Distribution Function

A continuous random variable X is characterized by a distribution function F_X(x) that is a continuous function of x for all −∞ < x < ∞.

The continuous uniform distribution is given by

    F(x) = 0,                  x < a
           (x − a)/(b − a),    a ≤ x < b
           1,                  x ≥ b
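The piecewise uniform CDF above translates directly into code. This is a minimal sketch; the function name `uniform_cdf` is our own, not from any library.

```python
def uniform_cdf(x, a, b):
    """CDF of the continuous uniform distribution on [a, b]."""
    if x < a:
        return 0.0          # below the support
    if x < b:
        return (x - a) / (b - a)  # linear ramp on [a, b)
    return 1.0              # at or above b

# On [2, 6]: the midpoint x = 4 sits exactly halfway up the ramp.
print(uniform_cdf(4.0, 2.0, 6.0))  # 0.5
```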

Probability Density Function

For a continuous random variable X, f(x) = dF(x)/dx is called the probability density function (pdf) of X.

The pdf enables us to obtain the CDF by integrating under the pdf:

    F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f_X(t) dt,   −∞ < x < ∞

Probability Density Function

We can also obtain other probabilities of interest:

    P(X ∈ (a, b]) = P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a)
                  = ∫_{−∞}^{b} f_X(t) dt − ∫_{−∞}^{a} f_X(t) dt
                  = ∫_{a}^{b} f_X(t) dt

f(x) has the following properties:
1) f(x) ≥ 0 for all x
2) ∫_{−∞}^{∞} f(x) dx = 1
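The interval probability P(a < X ≤ b) = ∫_a^b f_X(t) dt can be checked numerically. A minimal sketch, using the standard normal pdf as an example (the exact answer is known in closed form via the error function); `interval_prob` is a name of our own choosing.

```python
import math

def interval_prob(pdf, a, b, n=200_000):
    """Approximate P(a < X <= b) by a midpoint Riemann sum of the pdf over (a, b]."""
    h = (b - a) / n
    return sum(pdf(a + (i + 0.5) * h) for i in range(n)) * h

# Standard normal pdf; the exact interval probability follows from math.erf.
pdf = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
a, b = -1.0, 2.0
approx = interval_prob(pdf, a, b)
exact = 0.5 * (math.erf(b / math.sqrt(2)) - math.erf(a / math.sqrt(2)))
```

With this many subintervals the midpoint rule agrees with the closed form to well below 1e-9.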

Practice Problem

Show that the CDF of a continuous random variable does not have any jumps, i.e. P(X = c) is zero.

Proof:
    P(X = c) = P(c ≤ X ≤ c) = ∫_{c}^{c} f_X(y) dy = 0

Show that for a continuous random variable X,
    P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b) = ∫_{a}^{b} f_X(x) dx

The Exponential Distribution

The exponential distribution function is given by

    F(x) = 1 − e^{−λx},   if 0 ≤ x < ∞
           0,             otherwise

The pdf of an exponential random variable X is given by

    f(x) = λ e^{−λx},   if 0 ≤ x < ∞
           0,           otherwise
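A quick simulation check of this CDF: if U is Uniform(0, 1), then −ln(U)/λ is exponentially distributed (inverse-transform sampling), so the empirical CDF of such samples should match 1 − e^{−λx}. A minimal sketch, not part of the lecture; the variable names are our own.

```python
import math
import random

random.seed(0)
lam, x0, n = 1.5, 1.0, 200_000

# Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(U)/lam ~ Exp(lam).
samples = [-math.log(random.random()) / lam for _ in range(n)]

empirical = sum(1 for s in samples if s <= x0) / n   # empirical F(x0)
exact = 1 - math.exp(-lam * x0)                      # 1 - e^{-lam * x0}
```

With 200,000 samples the empirical CDF at x0 typically matches the closed form to within about 0.002.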

Markov Property of Exponential Distribution

Suppose we know that X, an exponential random variable, exceeds some given value t, i.e. X > t. (Example: X is the lifetime of a component, and we have observed that the component has already been operating for t hours.)

We are interested in the distribution of Y = X − t, the remaining or residual lifetime.

Let G_t(y) denote the conditional probability of Y ≤ y given that X > t. Therefore,
    G_t(y) = P(Y ≤ y | X > t)

Show that G_t(y) is independent of t.

Markov Property of Exponential Distribution

    G_t(y) = P(Y ≤ y | X > t)
           = P(X − t ≤ y | X > t)
           = P(X ≤ y + t | X > t)
           = P(X ≤ y + t and X > t) / P(X > t)
           = P(t < X ≤ y + t) / P(X > t)
           = ∫_{t}^{y+t} f(x) dx / ∫_{t}^{∞} f(x) dx

Markov Property of Exponential Distribution

    G_t(y) = ∫_{t}^{y+t} λ e^{−λx} dx / ∫_{t}^{∞} λ e^{−λx} dx
           = e^{−λt} (1 − e^{−λy}) / e^{−λt}
           = 1 − e^{−λy}

Thus G_t(y) is independent of t and is identical to the original exponential distribution of X. The distribution of remaining life does not depend on how long the component has been operating.

If the inter-arrival times for network traffic are exponentially distributed, then the Markov (memoryless) property implies that the time we must wait for a new packet arrival is statistically independent of how long we have already spent waiting for it!
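The memoryless property can be demonstrated by simulation: among samples that survive past t, the fraction surviving a further y should equal the unconditional survival probability e^{−λy}. A minimal sketch under our own choice of λ, t, and y.

```python
import math
import random

random.seed(1)
lam, t, y, n = 1.0, 0.7, 0.5, 500_000

# Exponential samples by inverse transform: X = -ln(U)/lam.
xs = [-math.log(random.random()) / lam for _ in range(n)]

survivors = [x for x in xs if x > t]
cond = sum(1 for x in survivors if x > t + y) / len(survivors)  # P(X > t+y | X > t)
uncond = math.exp(-lam * y)                                     # P(X > y)
```

The conditional and unconditional survival probabilities agree to within simulation noise, exactly as the derivation above predicts.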

An Interesting Statistical Observation

If X is a non-negative continuous random variable with the Markov property, then show that the distribution of X must be exponential.

Proof

For a random variable with the Markov property,

    P(t < X ≤ y + t) / P(X > t) = P(X ≤ y) = P(0 < X ≤ y)

Therefore,

    [F_X(y + t) − F_X(t)] / [1 − F_X(t)] = F_X(y) − F_X(0)
    F_X(y + t) − F_X(t) = [1 − F_X(t)][F_X(y) − F_X(0)]

Since F_X(0) = 0, rearranging the equation we can write

    [F_X(y + t) − F_X(y)] / t = [F_X(t) / t] · [1 − F_X(y)]

Taking the limit on both sides,

    lim_{t→0} [F_X(y + t) − F_X(y)] / t = lim_{t→0} {[F_X(0 + t) − F_X(0)] / t} · [1 − F_X(y)]

Let F′_X(y) = dF_X(y)/dy. Then,

    F′_X(y) = F′_X(0)[1 − F_X(y)]

Let R_X(y) = 1 − F_X(y); then R′_X(y) = −F′_X(y). Substituting,

    R′_X(y) = R′_X(0) R_X(y)

The solution to this differential equation is given by

    R_X(y) = K e^{R′_X(0) y}

R′_X(0) = −F′_X(0) = −f_X(0), which is a constant. Further, R_X(0) = 1, so K = 1. Denoting f_X(0) by the constant λ, we get

    R_X(y) = e^{−λy}

Therefore,

    F_X(y) = 1 − e^{−λy}

Order Statistics

Let X_1, X_2, ..., X_n be mutually independent, identically distributed continuous random variables, each having the distribution function F and density f.

Let Y_1, Y_2, ..., Y_n be random variables obtained by permuting the set X_1, X_2, ..., X_n so as to be in increasing order:

    Y_1 = min{X_1, X_2, ..., X_n}
    Y_n = max{X_1, X_2, ..., X_n}

The random variable Y_k is called the kth-order statistic.

Example: Let X_i be the lifetime of the ith component in a system of n independent components.
- If the system is a series system, then Y_1 is the overall system lifetime.
- If the system is a parallel system, then Y_n is the overall system lifetime.
- Y_{n−m+1} is the lifetime of an m-out-of-n system (the system works while at least m components work, i.e. it can sustain n − m failures).
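The series-system example has a neat closed form for exponential components: the minimum of n i.i.d. Exp(λ) variables is Exp(nλ), so E[Y_1] = 1/(nλ). A minimal simulation sketch (parameter choices are our own) checking that claim:

```python
import math
import random

random.seed(2)
lam, n_comp, trials = 1.0, 4, 200_000

y1_sum = 0.0
for _ in range(trials):
    # Lifetimes of the n components, each Exp(lam), via inverse transform.
    lifetimes = [-math.log(random.random()) / lam for _ in range(n_comp)]
    y1_sum += min(lifetimes)   # series-system lifetime Y_1

mean_y1 = y1_sum / trials
# For iid Exp(lam) components, Y_1 ~ Exp(n_comp * lam), so E[Y_1] = 1/(n_comp * lam).
```

The sample mean of Y_1 lands close to 1/(nλ) = 0.25, consistent with the minimum of exponentials again being exponential.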

Joint Distribution Function

The joint (or compound) distribution function of random variables X and Y is defined by

    F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y),   −∞ < x < ∞; −∞ < y < ∞

The joint (compound) probability density function of X and Y is given by f(x, y); we can often find a function f(x, y) such that

    F_{X,Y}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(u, v) du dv

Therefore,

    f(x, y) = ∂²F_{X,Y}(x, y) / ∂x ∂y

Expectation

The expectation, E[X], of a random variable X is defined by

    E[X] = Σ_i x_i P(x_i),          if X is discrete
    E[X] = ∫_{−∞}^{∞} x f(x) dx,    if X is continuous

The expectation of a random variable X with exponential density:

    E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_{0}^{∞} λ x e^{−λx} dx

Let u = λx; then du = λ dx, and therefore

    E[X] = (1/λ) ∫_{0}^{∞} u e^{−u} du = 1/λ
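The result E[X] = 1/λ can be verified by integrating x·f(x) numerically. A minimal sketch: we truncate the infinite integral at T = 40, where the exponential tail is negligible for λ = 2 (both choices are our own).

```python
import math

lam = 2.0
T, n = 40.0, 400_000     # truncate the integral at T; the tail beyond T is negligible
h = T / n

mean = 0.0
for i in range(n):
    x = (i + 0.5) * h                       # midpoint rule on [0, T]
    mean += x * lam * math.exp(-lam * x) * h  # contribution of x * f(x)
```

The midpoint sum reproduces 1/λ = 0.5 to better than six decimal places.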

Central Moments

We define the kth central moment, µ_k, of the random variable X by

    µ_k = E[(X − E[X])^k]

The zeroth central moment is the total probability (i.e. one), the first central moment is zero, the second central moment is the variance, and the third central moment measures the skewness (the standardized skewness is µ_3 / µ_2^{3/2}).

Mean and Median

The mean is the statistical average of all data points: the expected value E[X].

The median is the point that separates the lower half of the data points from the upper half. Statistically, the median is a point m such that

    P(X ≤ m) ≥ 1/2   and   P(X ≥ m) ≥ 1/2

For an absolutely continuous probability distribution with density f(x),

    P(X ≤ m) = ∫_{−∞}^{m} f(x) dx = 1/2   (and hence P(X ≥ m) = 1/2 as well)
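For the exponential distribution the median has a closed form: solving F(m) = 1 − e^{−λm} = 1/2 gives m = ln(2)/λ. A minimal sketch checking that the CDF evaluates to one half there (λ = 2 is our own choice):

```python
import math

lam = 2.0
median = math.log(2) / lam               # solve 1 - exp(-lam * m) = 1/2 for m
cdf_at_median = 1 - math.exp(-lam * median)  # should be exactly one half
```

Note that the median ln(2)/λ ≈ 0.347/λ is smaller than the mean 1/λ, reflecting the right-skew of the exponential distribution.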

Mode and Skewness

The mode is the value that appears most often in a data set.
- For a discrete distribution: the value x at which the pmf takes its maximum value.
- For a continuous distribution: the value x at which the pdf has its maximum value.

Skewness measures the asymmetry of a distribution about its mean; a distribution with a longer right tail (such as the exponential) has positive skewness.

Thank You