Poisson Convergence: Weak Law of Small Numbers. João Brazuna. Probability Theory, Instituto Superior Técnico, December 19, 2016


1 Poisson Convergence. Weak Law of Small Numbers. Probability Theory, Instituto Superior Técnico. December 19, 2016

2 Contents. 1. Simple Weak Law of Small Numbers. 2. From the Weak Law of Small Numbers to Poisson Processes.


4 Theorem (Weak Law of Small Numbers). Events with low frequency in a large population follow a Poisson distribution, even when the probabilities of the events vary. Also known as the law of rare events, it was proposed by Ladislaus Josephovich Bortkiewicz in 1898.

5 What do we know? Bernoulli Distribution. X ~ Bernoulli(p), with P(X = x) = p if x = 1 and 1 − p if x = 0, where p is the probability of success of a Bernoulli trial. Binomial Distribution. If X_1, …, X_n are independent Bernoulli trials with a constant probability of success p, then S = Σ_{j=1}^n X_j ~ Binomial(n, p).

6 What are we going to see? Bernoulli Distribution. X ~ Bernoulli(p), with P(X = x) = p if x = 1 and 1 − p if x = 0, where p is the probability of success of a Bernoulli trial. Binomial-type sums. If X_1, …, X_n are independent Bernoulli trials with possibly different small probabilities of success p_j, then S = Σ_{j=1}^n X_j ≈ Poisson(Σ_{j=1}^n p_j).

7 Bernoulli, Binomial and Poisson Distributions. Bernoulli and Binomial Distributions: if X_j ~ Bernoulli(p) i.i.d., then S = Σ_{j=1}^n X_j ~ Binomial(n, p). Bernoulli and Poisson Distributions: if X_j ~ Bernoulli(p_j) independent, then S = Σ_{j=1}^n X_j ≈ Poisson(Σ_{j=1}^n p_j). Corollary: if S ~ Binomial(n, p), then S ≈ Poisson(np).
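The corollary can be checked numerically. The sketch below is not from the slides; the parameters n = 1000 and p = 0.003 are arbitrary choices. It compares the Binomial(n, p) and Poisson(np) probability mass functions through their total variation distance:

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    # P(S = k) for S ~ Binomial(n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # P(Z = k) for Z ~ Poisson(lam)
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.003        # many trials, small success probability
lam = n * p               # matching Poisson parameter np = 3

# Total variation distance (support truncated where both pmfs are negligible)
tv = 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k)) for k in range(50))
print(f"TV distance between Binomial({n}, {p}) and Poisson({lam}): {tv:.5f}")
```

Le Cam's inequality bounds this distance by Σ_{j=1}^n p_j² = np² = 0.009, so the printed value is small.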


9 Notation - Random Variables. Let us consider a sequence of random variables {X_{n,j}} such that n ∈ ℕ and j ∈ {1, …, n}. This means that for every n ∈ ℕ we have n random variables X_{n,1}, X_{n,2}, …, X_{n,n}, since the index j is a natural number between 1 and n. Notation - Probability of Success: p_{n,j} = P(X_{n,j} = 1). Notation - Sum of Interest (also a sequence): S_n = X_{n,1} + X_{n,2} + … + X_{n,n} = Σ_{j=1}^n X_{n,j}.

10 Theorem (Weak Law of Small Numbers). For each n ∈ ℕ, let X_{n,j}, with j ∈ {1, …, n}, be independent random variables such that P(X_{n,j} = 1) = p_{n,j} and P(X_{n,j} = 0) = 1 − p_{n,j}, which means that X_{n,j} ~ Bernoulli(p_{n,j}). Suppose that, as n → ∞: (1) Σ_{j=1}^n p_{n,j} → λ ∈ ℝ⁺; (2) max_{j∈{1,…,n}} p_{n,j} → 0. Then S_n = X_{n,1} + … + X_{n,n} →d Z, where Z ~ Poisson(λ).
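A Monte Carlo sketch of the theorem (illustrative only; the choice p_{n,j} ∝ j and all sample sizes are arbitrary): sum independent Bernoulli(p_{n,j}) variables with unequal small success probabilities summing to λ, and compare the empirical law of S_n with Poisson(λ).

```python
import random
from math import exp, factorial

random.seed(0)
n, lam = 500, 2.0

# Unequal small probabilities p_{n,j} proportional to j, scaled to sum to lam
weights = list(range(1, n + 1))
total = sum(weights)
p = [lam * w / total for w in weights]
assert abs(sum(p) - lam) < 1e-9 and max(p) < 0.01

# Empirical distribution of S_n = sum of independent Bernoulli(p_{n,j})
reps = 5000
counts = {}
for _ in range(reps):
    s = sum(1 for pj in p if random.random() < pj)
    counts[s] = counts.get(s, 0) + 1

for k in range(6):
    emp = counts.get(k, 0) / reps
    poi = exp(-lam) * lam**k / factorial(k)
    print(f"k={k}: empirical {emp:.3f} vs Poisson {poi:.3f}")
```

With Σ p_{n,j} = 2 and max_j p_{n,j} < 0.01, the empirical frequencies track the Poisson(2) pmf closely, as the theorem predicts.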

11 Auxiliary Lemma 1. Lemma: let z_1, …, z_n, w_1, …, w_n ∈ ℂ with modulus not greater than ρ. Then |Π_{j=1}^n z_j − Π_{j=1}^n w_j| ≤ ρ^{n−1} Σ_{j=1}^n |z_j − w_j|. Proof: by induction over n. For n = 1: |z_1 − w_1| = ρ^{1−1} |z_1 − w_1|. Induction Hypothesis: |Π_{j=1}^n z_j − Π_{j=1}^n w_j| ≤ ρ^{n−1} Σ_{j=1}^n |z_j − w_j| for a certain n ∈ ℕ.

12 Auxiliary Lemma 1 - Proof (cont.) To prove: |Π_{j=1}^{n+1} z_j − Π_{j=1}^{n+1} w_j| ≤ ρ^n Σ_{j=1}^{n+1} |z_j − w_j| for the same n. We have |Π_{j=1}^{n+1} z_j − Π_{j=1}^{n+1} w_j| = |z_{n+1} Π_{j=1}^n z_j − w_{n+1} Π_{j=1}^n w_j| = |z_{n+1} Π_{j=1}^n z_j − z_{n+1} Π_{j=1}^n w_j + z_{n+1} Π_{j=1}^n w_j − w_{n+1} Π_{j=1}^n w_j| ≤ |z_{n+1}| |Π_{j=1}^n z_j − Π_{j=1}^n w_j| + |z_{n+1} − w_{n+1}| |Π_{j=1}^n w_j|

13 Auxiliary Lemma 1 - Proof (cont.) ≤ (by the I.H. and |z_{n+1}| ≤ ρ, |w_j| ≤ ρ) ρ · ρ^{n−1} Σ_{j=1}^n |z_j − w_j| + |z_{n+1} − w_{n+1}| · ρ^n = ρ^n Σ_{j=1}^{n+1} |z_j − w_j|.

14 Auxiliary Lemma 2. Lemma: if z ∈ ℂ with |z| ≤ 1 then |e^z − (1 + z)| ≤ |z|². Proof: using the Taylor series, e^z = 1 + z + z²/2! + z³/3! + z⁴/4! + … = Σ_{n=0}^∞ z^n/n!, so e^z − (1 + z) = z²/2! + z³/3! + z⁴/4! + … = Σ_{n=2}^∞ z^n/n!.

15 Auxiliary Lemma 2 - Proof (cont.) Since n! ≥ 2^{n−1} for n ≥ 2 and |z| ≤ 1, |e^z − (1 + z)| ≤ Σ_{n=2}^∞ |z|^n/n! ≤ Σ_{n=2}^∞ |z|^n/2^{n−1} = (|z|²/2) Σ_{k=0}^∞ (|z|/2)^k = |z|² / (2(1 − |z|/2)) ≤ |z|².
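The inequality of Lemma 2 is easy to probe numerically. This sketch (not from the slides; the sample count is arbitrary) tests random points of the closed unit disc:

```python
import cmath
import random

random.seed(1)

def bound_holds(z):
    # Check |e^z - (1 + z)| <= |z|^2, with a tiny tolerance for rounding
    return abs(cmath.exp(z) - (1 + z)) <= abs(z) ** 2 + 1e-12

# Sample random points and keep those inside the closed unit disc |z| <= 1
checked = 0
for _ in range(10000):
    z = complex(random.uniform(-1, 1), random.uniform(-1, 1))
    if abs(z) <= 1:
        assert bound_holds(z)
        checked += 1
print(f"bound verified at {checked} points of the unit disc")
```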

16 Theorem - Proof. We are going to prove the result using Lévy's Continuity Theorem, which states that if ϕ_{S_n}(t) → ϕ_Z(t) as n → ∞ for all t ∈ ℝ and ϕ_Z is continuous at t = 0, then S_n →d Z. For each r.v. X_{n,j}, let ϕ_{n,j} be its characteristic function: ϕ_{n,j}(t) = E[e^{itX_{n,j}}] = e^{it·1} P(X_{n,j} = 1) + e^{it·0} P(X_{n,j} = 0) = e^{it} p_{n,j} + 1·(1 − p_{n,j}) = (1 − p_{n,j}) + p_{n,j} e^{it}.

17 Theorem - Proof (cont.) So, S_n's characteristic function is ϕ_{S_n}(t) = E[e^{itS_n}] = E[e^{it Σ_{j=1}^n X_{n,j}}] = E[Π_{j=1}^n e^{itX_{n,j}}] = (by independence) Π_{j=1}^n E[e^{itX_{n,j}}] = Π_{j=1}^n ϕ_{n,j}(t) = Π_{j=1}^n [(1 − p_{n,j}) + p_{n,j} e^{it}] = Π_{j=1}^n [1 + p_{n,j}(e^{it} − 1)].

18 Theorem - Proof (cont.) Remember that if Z ~ Poisson(λ), its probability mass function is given by P(Z = z) = e^{−λ} λ^z/z! if z ∈ ℕ₀ and 0 otherwise, so its characteristic function is ϕ_Z(t) = E[e^{itZ}] = Σ_{z=0}^∞ e^{itz} P(Z = z) = Σ_{z=0}^∞ e^{itz} e^{−λ} λ^z/z! = e^{−λ} Σ_{z=0}^∞ (λe^{it})^z/z! = e^{−λ} e^{λe^{it}} = e^{λ(e^{it} − 1)}.
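The closed form e^{λ(e^{it} − 1)} can be checked against direct summation of E[e^{itZ}] over the pmf (λ = 3 and t = 1.7 are arbitrary choices):

```python
import cmath
from math import exp, factorial

lam, t = 3.0, 1.7

# Characteristic function by direct summation of E[e^{itZ}] over the pmf
series = sum(cmath.exp(1j * t * z) * exp(-lam) * lam**z / factorial(z)
             for z in range(100))

# Closed form derived on the slide
closed = cmath.exp(lam * (cmath.exp(1j * t) - 1))

print(abs(series - closed))  # essentially zero
```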

19 Theorem - Proof (cont.) To prove that ϕ_{S_n}(t) → ϕ_Z(t) as n → ∞, we are going to show that |ϕ_{S_n}(t) − ϕ_Z(t)| → 0. Consider a sequence of r.v. Y_n ~ Poisson(Σ_{j=1}^n p_{n,j}), so that ϕ_{Y_n}(t) = e^{Σ_{j=1}^n p_{n,j}(e^{it} − 1)}. Since, by assumption, Σ_{j=1}^n p_{n,j} → λ, we have ϕ_{Y_n}(t) → ϕ_Z(t), so, by Lévy's Theorem, Y_n →d Z.

20 Theorem - Proof (cont.) Hence Y_n →d Z. Now, |ϕ_{S_n}(t) − ϕ_{Y_n}(t)| = |Π_{j=1}^n [1 + p_{n,j}(e^{it} − 1)] − e^{Σ_{j=1}^n p_{n,j}(e^{it} − 1)}| = |Π_{j=1}^n [1 + p_{n,j}(e^{it} − 1)] − Π_{j=1}^n e^{p_{n,j}(e^{it} − 1)}|. Note that: 0 ≤ p_{n,j} ≤ 1 (it's a probability); |e^{it}| = 1 (it lies on the unit circle); |1 + p_{n,j}(e^{it} − 1)| ≤ 1; |e^{p_{n,j}(e^{it} − 1)}| = e^{p_{n,j}[Re(e^{it}) − 1]} ≤ 1 (exponent ≤ 0).

21 Theorem - Proof (cont.) Using auxiliary Lemmas 1 (with ρ = 1) and 2 (valid since max_j p_{n,j} ≤ 1/2 for n large enough and |e^{it} − 1| ≤ |e^{it}| + 1 = 2, so |p_{n,j}(e^{it} − 1)| ≤ 1), |Π_{j=1}^n [1 + p_{n,j}(e^{it} − 1)] − Π_{j=1}^n e^{p_{n,j}(e^{it} − 1)}| ≤ (Lemma 1) Σ_{j=1}^n |e^{p_{n,j}(e^{it} − 1)} − [1 + p_{n,j}(e^{it} − 1)]| ≤ (Lemma 2) Σ_{j=1}^n p²_{n,j} |e^{it} − 1|².

22 Theorem - Proof (cont.) Since Σ_{j=1}^n p²_{n,j} |e^{it} − 1|² ≤ 4 max_j {p_{n,j}} Σ_{j=1}^n p_{n,j} → 0 · λ = 0 and ϕ_{Y_n}(t) → ϕ_Z(t) as n → ∞, |ϕ_{S_n}(t) − ϕ_Z(t)| = |ϕ_{S_n}(t) − ϕ_{Y_n}(t) + ϕ_{Y_n}(t) − ϕ_Z(t)| ≤ |ϕ_{S_n}(t) − ϕ_{Y_n}(t)| + |ϕ_{Y_n}(t) − ϕ_Z(t)| → 0, and we conclude that ϕ_{S_n} → ϕ_Z, so, by Lévy's Theorem, S_n →d Z.


24 Some Examples

25 World War II Example. Let U_{n,j} ~ Unif[−n, n], independent, X_{n,j} = 1 if U_{n,j} ∈ ]a, b[ ⊆ [−n, n] and 0 otherwise, and S_n = Σ_{j=1}^n X_{n,j} the number of points that land on ]a, b[. So p_{n,j} = P(X_{n,j} = 1) = P(U_{n,j} ∈ ]a, b[) = ∫_a^b 1/(2n) du = (b − a)/(2n). Then Σ_{j=1}^n p_{n,j} = n(b − a)/(2n) = (b − a)/2 ∈ ℝ⁺, which does not depend on n, and max_{j∈{1,…,n}} p_{n,j} = (b − a)/(2n) → 0. Therefore, S_n →d Z ~ Poisson((b − a)/2).
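A simulation of this example (the interval ]a, b[ = ]−1, 3[ and the sample sizes are arbitrary choices): for each replication, drop n uniform points on [−n, n] and count those landing in ]a, b[; the mean count should approach (b − a)/2 = 2.

```python
import random

random.seed(2)
a, b = -1.0, 3.0          # fixed interval; (b - a)/2 = 2 is the Poisson mean
lam = (b - a) / 2

reps, n = 5000, 500
# Each replication drops n uniform points on [-n, n] and counts hits in ]a, b[
hits = [sum(1 for _ in range(n) if a < random.uniform(-n, n) < b)
        for _ in range(reps)]

mean = sum(hits) / reps
print(f"empirical mean number of points in ]{a}, {b}[: {mean:.3f} (lambda = {lam})")
```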

26 World War II Example (cont.) The flying-bomb hits in South London during World War II fit a Poisson distribution. The area was divided into 576 regions with the same area, and the total number of hits was 537. Adapting our last result to ℝ², let S be the number of hits in a given region, so that S ≈ Poisson(537/576). (The slide's table, comparing the expected number of regions with k hits, 576 · P(S = k), against the real values, was lost in transcription.)
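With λ = 537/576 hits per region, the model side of the lost table can be recomputed; this sketch (not from the slides) prints the expected number of regions with k hits under the Poisson model:

```python
from math import exp, factorial

regions, hits = 576, 537
lam = hits / regions                      # average hits per region, ~0.932

# Expected number of regions with exactly k hits under a Poisson(lam) model
expected = [regions * exp(-lam) * lam**k / factorial(k) for k in range(5)]
expected.append(regions - sum(expected))  # remaining mass: k >= 5

for k, e in enumerate(expected):
    label = str(k) if k < 5 else "5+"
    print(f"k = {label}: expected {e:.1f} regions")
```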


28 Theorem (Weak Law of Small Numbers, general version). For each n ∈ ℕ, let X_{n,j}, with j ∈ {1, …, n}, be independent non-negative integer-valued random variables such that P(X_{n,j} = 1) = p_{n,j} and P(X_{n,j} ≥ 2) = ɛ_{n,j}. Suppose that, as n → ∞: (1) Σ_{j=1}^n p_{n,j} → λ ∈ ℝ⁺; (2) max_{j∈{1,…,n}} p_{n,j} → 0; (3) Σ_{j=1}^n ɛ_{n,j} → 0. Then S_n = X_{n,1} + … + X_{n,n} →d Z, where Z ~ Poisson(λ).

29 Proof. Let X′_{n,j} = 1 if X_{n,j} = 1 and 0 otherwise, and S′_n = X′_{n,1} + … + X′_{n,n} = Σ_{j=1}^n X′_{n,j}. By the first two assumptions and using the simple Weak Law of Small Numbers, we conclude that S′_n →d Z.

30 Proof (cont.) By the third assumption, we know that P(S_n ≠ S′_n) ≤ P(X_{n,j} ≥ 2 for some j) ≤ Σ_{j=1}^n ɛ_{n,j} → 0. Using the Converging Together Lemma, S_n →d Z.

31 Converging Together Lemma. If X_n →d X and Y_n →d c, where c ∈ ℝ is a constant, then X_n + Y_n →d X + c. Corollary: if S′_n →d Z and S_n − S′_n →d 0, then S_n →d Z.


33 Definition - Counting Process (in Continuous Time). A family of r.v. {N(t) : t ≥ 0} is said to be a counting process if it represents the total number of events that have occurred up to time t. It must satisfy: N(t) ∈ ℕ₀, ∀t ≥ 0; N(s) ≤ N(t), 0 ≤ s < t; N(t) − N(s) corresponds to the number of events that occurred in the interval ]s, t], 0 ≤ s < t.

34 Definition - Counting Process with Independent Increments. The counting process {N(t) : t ≥ 0} is said to have independent increments if the numbers of events that occur in disjoint intervals are independent r.v., that is, if 0 = t_0 < t_1 < … < t_n then N(t_1), N(t_2) − N(t_1), …, N(t_n) − N(t_{n−1}) are independent r.v. Definition - Counting Process with Stationary Increments. The counting process {N(t) : t ≥ 0} is said to have stationary increments if the distribution of the number of events that occur in any interval depends only on the length of the interval, that is, N(t_2 + s) − N(t_1 + s) =d N(t_2) − N(t_1), ∀s ≥ 0, 0 ≤ t_1 < t_2.

35 Definition 1 - Poisson Process. The counting process {N(t) : t ≥ 0} is said to be a Poisson process with rate λ > 0 if: {N(t) : t ≥ 0} has independent and stationary increments; N(t) ~ Poisson(λt). This is the usual definition, but it is redundant, since the second condition follows from the first one.

36 Definition 2 - Poisson Process. The counting process {N(t) : t ≥ 0} is said to be a Poisson process with rate λ > 0 if: N(0) = 0; {N(t) : t ≥ 0} has independent and stationary increments; P[N(h) = 1] = λh + o(h); P[N(h) ≥ 2] = o(h), where o(h) stands for functions g_1(h) and g_2(h) such that lim_{h→0} g_i(h)/h = 0, i ∈ {1, 2}. Let us check that this definition implies that N(t) ~ Poisson(λt).

37 Theorem. Let {N(t) : t ≥ 0} be a counting process in continuous time. Suppose that: N(0) = 0; {N(t) : t ≥ 0} has independent and stationary increments; P[N(h) = 1] = λh + o(h); P[N(h) ≥ 2] = o(h), where o(h) stands for functions g_1(h) and g_2(h) such that lim_{h→0} g_i(h)/h = 0, i ∈ {1, 2}. Then, N(t) ~ Poisson(λt).

38 Proof. Let X_{n,j} = N(jt/n) − N((j−1)t/n). Then, using stationary increments and N(0) = 0, p_{n,j} = P(X_{n,j} = 1) = P[N(jt/n) − N((j−1)t/n) = 1] = P[N(t/n) − N(0) = 1] = P[N(t/n) = 1] = λt/n + o(t/n). Hence Σ_{j=1}^n p_{n,j} = n[λt/n + o(t/n)] → λt.

39 Proof (cont.) max_{j∈{1,…,n}} p_{n,j} = max_{j∈{1,…,n}} {λt/n + o(t/n)} = λt/n + o(t/n), because it does not depend on j. Therefore, as n → ∞, max_{j∈{1,…,n}} p_{n,j} = λt/n + o(t/n) → 0. Finally, ɛ_{n,j} = P(X_{n,j} ≥ 2) = P[N(jt/n) − N((j−1)t/n) ≥ 2] = P[N(t/n) − N(0) ≥ 2] = o(t/n),

40 Proof (cont.) thus Σ_{j=1}^n ɛ_{n,j} = n · o(t/n) = t · o(t/n)/(t/n) → 0 as n → ∞. We are in the conditions of the last theorem, since the X_{n,j} are non-negative integer-valued r.v. with p_{n,j} = P(X_{n,j} = 1), ɛ_{n,j} = P(X_{n,j} ≥ 2) and: (1) Σ_{j=1}^n p_{n,j} → λt ∈ ℝ⁺; (2) max_{j∈{1,…,n}} p_{n,j} → 0; (3) Σ_{j=1}^n ɛ_{n,j} → 0. Therefore, N(t) = Σ_{j=1}^n X_{n,j} →d Z ~ Poisson(λt), i.e., N(t) ~ Poisson(λt).
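The proof's discretization can be simulated directly (rate, horizon and grid size below are arbitrary choices): approximate N(t) by n Bernoulli increments with success probability λt/n, mirroring X_{n,j} = N(jt/n) − N((j−1)t/n), and compare the mean of the resulting counts with λt.

```python
import random

random.seed(3)
lam, t, n = 1.5, 2.0, 200   # rate, time horizon, number of subintervals
p = lam * t / n             # ~ P(one event in a subinterval of length t/n)

# Each sample approximates N(t) as a sum of n Bernoulli(p) increments,
# one per subinterval ](j-1)t/n, jt/n]
reps = 5000
samples = [sum(1 for _ in range(n) if random.random() < p) for _ in range(reps)]

mean = sum(samples) / reps
print(f"empirical E[N(t)] = {mean:.3f}, target lambda*t = {lam * t}")
```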

41 Bibliography. Rick Durrett. Probability: Theory and Examples. CUP. Isabel Rodrigues. Apontamentos de Complementos de Probabilidades e Estatística. Instituto Superior Técnico. Manuel Cabral Morais. Lecture Notes - Probability Theory. Instituto Superior Técnico. Manuel Cabral Morais. Lecture Notes - Stochastic Processes. Instituto Superior Técnico, 2014.


E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) = Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of

More information

1 Inverse Transform Method and some alternative algorithms

1 Inverse Transform Method and some alternative algorithms Copyright c 2016 by Karl Sigman 1 Inverse Transform Method and some alternative algorithms Assuming our computer can hand us, upon demand, iid copies of rvs that are uniformly distributed on (0, 1), it

More information

arxiv: v2 [math.pr] 4 Sep 2017

arxiv: v2 [math.pr] 4 Sep 2017 arxiv:1708.08576v2 [math.pr] 4 Sep 2017 On the Speed of an Excited Asymmetric Random Walk Mike Cinkoske, Joe Jackson, Claire Plunkett September 5, 2017 Abstract An excited random walk is a non-markovian

More information

Discrete Distributions

Discrete Distributions Chapter 2 Discrete Distributions 2.1 Random Variables of the Discrete Type An outcome space S is difficult to study if the elements of S are not numbers. However, we can associate each element/outcome

More information

Math 6810 (Probability) Fall Lecture notes

Math 6810 (Probability) Fall Lecture notes Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),

More information

Stochastic Models (Lecture #4)

Stochastic Models (Lecture #4) Stochastic Models (Lecture #4) Thomas Verdebout Université libre de Bruxelles (ULB) Today Today, our goal will be to discuss limits of sequences of rv, and to study famous limiting results. Convergence

More information

Theory of Stochastic Processes 3. Generating functions and their applications

Theory of Stochastic Processes 3. Generating functions and their applications Theory of Stochastic Processes 3. Generating functions and their applications Tomonari Sei sei@mist.i.u-tokyo.ac.jp Department of Mathematical Informatics, University of Tokyo April 20, 2017 http://www.stat.t.u-tokyo.ac.jp/~sei/lec.html

More information

Summer School in Statistics for Astronomers & Physicists June 5-10, 2005

Summer School in Statistics for Astronomers & Physicists June 5-10, 2005 Summer School in Statistics for Astronomers & Physicists June 5-10, 2005 Session on Statistical Inference for Astronomers Poisson Processes and Gaussian Processes Donald Richards Department of Statistics

More information

6.1 Moment Generating and Characteristic Functions

6.1 Moment Generating and Characteristic Functions Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

Common Discrete Distributions

Common Discrete Distributions Common Discrete Distributions Statistics 104 Autumn 2004 Taken from Statistics 110 Lecture Notes Copyright c 2004 by Mark E. Irwin Common Discrete Distributions There are a wide range of popular discrete

More information

Chapter 4 - Lecture 3 The Normal Distribution

Chapter 4 - Lecture 3 The Normal Distribution Chapter 4 - Lecture 3 The October 28th, 2009 Chapter 4 - Lecture 3 The Standard Chapter 4 - Lecture 3 The Standard Normal distribution is a statistical unicorn It is the most important distribution in

More information

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution 1 ACM 116: Lecture 2 Agenda Independence Bayes rule Discrete random variables Bernoulli distribution Binomial distribution Continuous Random variables The Normal distribution Expected value of a random

More information

Notes 15 : UI Martingales

Notes 15 : UI Martingales Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.

More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

18.175: Lecture 17 Poisson random variables

18.175: Lecture 17 Poisson random variables 18.175: Lecture 17 Poisson random variables Scott Sheffield MIT 1 Outline More on random walks and local CLT Poisson random variable convergence Extend CLT idea to stable random variables 2 Outline More

More information

Kousha Etessami. U. of Edinburgh, UK. Kousha Etessami (U. of Edinburgh, UK) Discrete Mathematics (Chapter 7) 1 / 13

Kousha Etessami. U. of Edinburgh, UK. Kousha Etessami (U. of Edinburgh, UK) Discrete Mathematics (Chapter 7) 1 / 13 Discrete Mathematics & Mathematical Reasoning Chapter 7 (continued): Markov and Chebyshev s Inequalities; and Examples in probability: the birthday problem Kousha Etessami U. of Edinburgh, UK Kousha Etessami

More information

STOR Lecture 8. Random Variables - II

STOR Lecture 8. Random Variables - II STOR 435.001 Lecture 8 Random Variables - II Jan Hannig UNC Chapel Hill 1 / 25 Binomial Random variables Main properties Arise in describing the number of successes in n independent trials where each trial

More information

TMA 4265 Stochastic Processes

TMA 4265 Stochastic Processes TMA 4265 Stochastic Processes Norges teknisk-naturvitenskapelige universitet Institutt for matematiske fag Solution - Exercise 9 Exercises from the text book 5.29 Kidney transplant T A exp( A ) T B exp(

More information

Part I Stochastic variables and Markov chains

Part I Stochastic variables and Markov chains Part I Stochastic variables and Markov chains Random variables describe the behaviour of a phenomenon independent of any specific sample space Distribution function (cdf, cumulative distribution function)

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 4 Statistical Models in Simulation Purpose & Overview The world the model-builder sees is probabilistic rather than deterministic. Some statistical model

More information

Copyright c 2006 Jason Underdown Some rights reserved. choose notation. n distinct items divided into r distinct groups.

Copyright c 2006 Jason Underdown Some rights reserved. choose notation. n distinct items divided into r distinct groups. Copyright & License Copyright c 2006 Jason Underdown Some rights reserved. choose notation binomial theorem n distinct items divided into r distinct groups Axioms Proposition axioms of probability probability

More information

PROBABILITY THEORY LECTURE 3

PROBABILITY THEORY LECTURE 3 PROBABILITY THEORY LECTURE 3 Per Sidén Division of Statistics Dept. of Computer and Information Science Linköping University PER SIDÉN (STATISTICS, LIU) PROBABILITY THEORY - L3 1 / 15 OVERVIEW LECTURE

More information

SUPPLEMENT TO MARKOV PERFECT INDUSTRY DYNAMICS WITH MANY FIRMS TECHNICAL APPENDIX (Econometrica, Vol. 76, No. 6, November 2008, )

SUPPLEMENT TO MARKOV PERFECT INDUSTRY DYNAMICS WITH MANY FIRMS TECHNICAL APPENDIX (Econometrica, Vol. 76, No. 6, November 2008, ) Econometrica Supplementary Material SUPPLEMENT TO MARKOV PERFECT INDUSTRY DYNAMICS WITH MANY FIRMS TECHNICAL APPENDIX (Econometrica, Vol. 76, No. 6, November 2008, 1375 1411) BY GABRIEL Y. WEINTRAUB,C.LANIER

More information

Probability Distributions - Lecture 5

Probability Distributions - Lecture 5 Probability Distributions - Lecture 5 1 Introduction There are a number of mathematical models of probability density functions that represent the behavior of physical systems. In this lecture we explore

More information

Recap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks

Recap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks Recap Probability, stochastic processes, Markov chains ELEC-C7210 Modeling and analysis of communication networks 1 Recap: Probability theory important distributions Discrete distributions Geometric distribution

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

APM 421 Probability Theory Discrete Random Variables. Jay Taylor Fall Jay Taylor (ASU) Fall / 86

APM 421 Probability Theory Discrete Random Variables. Jay Taylor Fall Jay Taylor (ASU) Fall / 86 APM 421 Probability Theory Discrete Random Variables Jay Taylor Fall 2013 Jay Taylor (ASU) Fall 2013 1 / 86 Outline 1 Motivation 2 Infinite Sets and Cardinality 3 Countable Additivity 4 Discrete Random

More information

AN UNDERGRADUATE LECTURE ON THE CENTRAL LIMIT THEOREM

AN UNDERGRADUATE LECTURE ON THE CENTRAL LIMIT THEOREM AN UNDERGRADUATE LECTURE ON THE CENTRAL LIMIT THEOREM N.V. KRYLOV In the first section we explain why the central limit theorem for the binomial 1/2 distributions is natural. The second section contains

More information

Convergence in Distribution

Convergence in Distribution Convergence in Distribution Undergraduate version of central limit theorem: if X 1,..., X n are iid from a population with mean µ and standard deviation σ then n 1/2 ( X µ)/σ has approximately a normal

More information