1 Generating functions
Even quite straightforward counting problems can lead to laborious and lengthy calculations. These are greatly simplified by using generating functions.[2]

Definition 1.1. Given a collection of real numbers $(a_k)_{k \ge 0}$, the function
\[
G(s) \equiv G_a(s) \stackrel{\mathrm{def}}{=} \sum_{k \ge 0} a_k s^k \tag{1.1}
\]
is called the generating function of $(a_k)_{k \ge 0}$.

Definition 1.2. If $X$ is a discrete random variable with values in $\mathbb{Z}^+ \stackrel{\mathrm{def}}{=} \{0, 1, \dots\}$, its (probability) generating function,
\[
G(s) \equiv G_X(s) \stackrel{\mathrm{def}}{=} E\bigl(s^X\bigr) = \sum_{k \ge 0} s^k \, P(X = k), \tag{1.2}
\]
is just the generating function of the probability mass function of $X$.

One of the most useful applications of generating functions in probability theory is the following observation:

Theorem 1.3. If $X$ and $Y$ are independent random variables with values in $\{0, 1, 2, \dots\}$ and $Z \stackrel{\mathrm{def}}{=} X + Y$, then their generating functions satisfy[3]
\[
G_Z(s) = G_{X+Y}(s) = G_X(s)\, G_Y(s).
\]

Example 1.4. Let $X_1, X_2, \dots, X_n$ be independent identically distributed random variables[4] with values in $\{0, 1, 2, \dots\}$ and let $S_n = X_1 + \dots + X_n$. Then
\[
G_{S_n}(s) = G_{X_1}(s) \cdots G_{X_n}(s) = \bigl[G_X(s)\bigr]^n.
\]

Example 1.5. Let $X_1, X_2, \dots$ be i.i.d.r.v. with values in $\{0, 1, 2, \dots\}$ and let $N \ge 0$ be an integer-valued random variable independent of $\{X_k\}_{k \ge 1}$. Then $S_N \stackrel{\mathrm{def}}{=} X_1 + \dots + X_N$ has generating function
\[
G_{S_N}(s) = G_N\bigl(G_X(s)\bigr). \tag{1.3}
\]

Solution. This is a straightforward application of the partition theorem for expectations. Alternatively, the result follows from the standard properties of conditional expectations:
\[
E\bigl(z^{S_N}\bigr) = E\Bigl[ E\bigl(z^{S_N} \mid N\bigr) \Bigr] = E\Bigl[ \bigl(G_X(z)\bigr)^N \Bigr] = G_N\bigl(G_X(z)\bigr).
\]

[2] introduced by de Moivre and Euler in the early eighteenth century.
[3] recall: if $X$ and $Y$ are independent discrete random variables, and $f, g : \mathbb{Z}^+ \to \mathbb{R}$ are arbitrary functions, then $f(X)$ and $g(Y)$ are independent random variables and $E\bigl[f(X)g(Y)\bigr] = E f(X) \, E g(Y)$.
[4] from now on we shall often abbreviate this to just i.i.d.r.v.
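The identity (1.3) is easy to test numerically. Below is a minimal Monte Carlo sketch in Python (assuming NumPy is available; the parameters $\lambda = 2$, $p = 0.4$, $s = 0.7$ are arbitrary illustrative choices, not part of the examples above). It compares an empirical estimate of $E[s^{S_N}]$ with $G_N(G_X(s))$ for $N \sim \mathrm{Poi}(\lambda)$ and geometric summands on $\{0, 1, \dots\}$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, s, trials = 2.0, 0.4, 0.7, 100_000   # arbitrary illustrative choices

# Sample S_N = X_1 + ... + X_N with N ~ Poi(lam) independent of the X_i,
# where each X_i is geometric on {0, 1, 2, ...} with success probability p.
N = rng.poisson(lam, size=trials)
S = np.array([rng.geometric(p, size=n).sum() - n for n in N])  # shift {1,2,...} to {0,1,...}

G_X = lambda t: p / (1 - (1 - p) * t)   # pgf of the geometric law on {0,1,...}
G_N = lambda t: np.exp(lam * (t - 1))   # pgf of Poi(lam)

print(np.mean(s ** S))   # empirical E[s^{S_N}]
print(G_N(G_X(s)))       # value predicted by (1.3), approx 0.5376
```

The two printed numbers agree up to Monte Carlo error, as (1.3) predicts.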
Example 1.6. [Renewals] Imagine a diligent janitor who replaces a light bulb the same day as it burns out. Suppose the first bulb is put in on day 0 and let $X_i$ be the lifetime of the $i$th light bulb. Assume that the individual lifetimes $X_i$ are i.i.d.r.v.'s with values in $\{1, 2, \dots\}$ having a common distribution with generating function $G_f(s)$. Consider the probabilities
\[
r_n \stackrel{\mathrm{def}}{=} P\bigl(\text{a light bulb was replaced on day } n\bigr), \qquad
f_k \stackrel{\mathrm{def}}{=} P\bigl(\text{the first light bulb was replaced on day } k\bigr).
\]
We obviously have $r_0 = 1$, $f_0 = 0$, and for $n \ge 1$,
\[
r_n = f_1 r_{n-1} + f_2 r_{n-2} + \dots + f_n r_0 = \sum_k f_k r_{n-k}.
\]
A standard computation implies that $G_r(s) = 1 + G_f(s)\, G_r(s)$ for all $|s| < 1$, so that $G_r(s) = 1/(1 - G_f(s))$.

In general, we say a sequence $(c_n)_{n \ge 0}$ is the convolution of $(a_k)_{k \ge 0}$ and $(b_m)_{m \ge 0}$ (write $c = a \star b$), if
\[
c_n = \sum_k a_k b_{n-k}, \qquad n \ge 0. \tag{1.4}
\]
The key property of convolutions is given by the following result:

Theorem 1.7. [Convolution thm] If $c = a \star b$, then $G_c(s) = G_a(s)\, G_b(s)$.

Why do we care? If the generating function $G_a(s)$ of $(a_n)_{n \ge 0}$ is analytic near the origin, then there is a one-to-one correspondence between $G_a(s)$ and $(a_n)_{n \ge 0}$; namely, $a_k$ can be recovered via[5]
\[
a_k = \frac{1}{k!} \, \frac{d^k}{ds^k} G_a(s) \Big|_{s=0}. \tag{1.5}
\]
This result is often referred to as the uniqueness property of generating functions.

Example 1.8. Let $X \sim \mathrm{Poi}(\lambda)$ and $Y \sim \mathrm{Poi}(\mu)$ be independent. Then $Z = X + Y$ is $\mathrm{Poi}(\lambda + \mu)$.

Solution. A straightforward computation gives $G_X(s) = e^{\lambda(s-1)}$, and Theorem 1.3 implies
\[
G_Z(s) = G_X(s)\, G_Y(s) = e^{\lambda(s-1)} e^{\mu(s-1)} = e^{(\lambda+\mu)(s-1)},
\]
so that the result follows by uniqueness. A similar argument implies the following result.

[5] if a power series $G_a(s)$ is finite for $|s| < r$ with $r > 0$, then it can be differentiated in the disk $|s| < r$.
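Example 1.6 also lends itself to a direct check: one can generate the $r_n$ from the renewal recursion and, independently, read off the coefficients of $1/(1 - G_f(s))$ by expanding the geometric series $\sum_{m \ge 0} G_f(s)^m$ through repeated convolution (1.4). A small sketch in plain Python, assuming (as an illustrative choice) geometric lifetimes $f_k = q^{k-1} p$, for which $r_n = p$ for all $n \ge 1$:

```python
p, q, K = 0.3, 0.7, 10          # illustrative lifetime law f_k = q^(k-1) p, truncated at degree K

f = [0.0] + [q ** (k - 1) * p for k in range(1, K + 1)]   # f_0 = 0

# r_n via the renewal recursion r_n = sum_k f_k r_{n-k}, with r_0 = 1.
r = [1.0]
for n in range(1, K + 1):
    r.append(sum(f[k] * r[n - k] for k in range(1, n + 1)))

# Coefficients of 1/(1 - G_f(s)) = sum_{m >= 0} G_f(s)^m, truncated at degree K.
# Since f_0 = 0, terms with m > K contribute nothing below degree K + 1.
c = [1.0] + [0.0] * K           # coefficients of G_f(s)^0 = 1
power = c[:]
for _ in range(K):
    power = [sum(power[j] * f[n - j] for j in range(n + 1)) for n in range(K + 1)]
    c = [a + b for a, b in zip(c, power)]

print(r)   # [1.0, 0.3, 0.3, ...]: here r_n = p for every n >= 1
print(c)   # agrees with r up to rounding, confirming G_r(s) = 1/(1 - G_f(s))
```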
Example 1.9. If $X \sim \mathrm{Bin}(n, p)$ and $Y \sim \mathrm{Bin}(m, p)$ are independent, then $X + Y \sim \mathrm{Bin}(n + m, p)$.

Another useful property of the probability generating function $G_X(s)$ is that it can be used to compute moments of $X$:

Theorem 1.10. If $X$ has generating function $G(s)$, then[6]
\[
E\bigl[X(X-1)\cdots(X-k+1)\bigr] = G^{(k)}(1).
\]

Remark. The quantity $E\bigl[X(X-1)\cdots(X-k+1)\bigr]$ is called the $k$th factorial moment of $X$. Notice also that
\[
\mathrm{Var}(X) = G''_X(1) + G'_X(1) - \bigl(G'_X(1)\bigr)^2. \tag{1.6}
\]

Proof. Fix $s \in (0, 1)$ and differentiate $G(s)$ $k$ times[7] to get
\[
G^{(k)}(s) = E\bigl[s^{X-k}\, X(X-1)\cdots(X-k+1)\bigr].
\]
Taking the limit $s \uparrow 1$ and using the Abel theorem,[8] we obtain
\[
G^{(k)}(s) \to E\bigl[X(X-1)\cdots(X-k+1)\bigr].
\]

Remark. Notice also that
\[
\lim_{s \uparrow 1} G_X(s) = \lim_{s \uparrow 1} E\bigl[s^X\bigr] = P(X < \infty).
\]
This allows us to check whether a variable is finite if we do not know this a priori; see Example 1.14 below.

Exercise 1.11. Let $S_N$ be defined as in Example 1.5. Use (1.3) to compute $E[S_N]$ and $\mathrm{Var}[S_N]$ in terms of $E[X]$, $E[N]$, $\mathrm{Var}[X]$ and $\mathrm{Var}[N]$. Now check your result for $E[S_N]$ and $\mathrm{Var}[S_N]$ by directly applying the partition theorem for expectations.

Example 1.12. A bird lays $N$ eggs, each being pink with probability $p$ and blue otherwise. Assuming that $N \sim \mathrm{Poi}(\lambda)$, find the distribution of the total number $K$ of pink eggs.

Solution. a) A standard approach in the spirit of Core A gives the value $P(K = k)$ directly; as a by-product of this computation one deduces that $K$ and $M = N - K$ are independent (do this!).
b) One can also think in terms of a two-stage probabilistic experiment similar to the one described in Example 1.5. Here $K$ has the same distribution as $S_N$ with $N \sim \mathrm{Poi}(\lambda)$ and $X \sim \mathrm{Ber}(p)$; consequently, $G_N(s) = e^{\lambda(s-1)}$ and $G_X(s) = 1 + p(s-1)$, so that $G_{S_N}(s) = e^{\lambda p (s-1)}$, i.e., $K \sim \mathrm{Poi}(\lambda p)$.

[6] here, $G^{(k)}(1)$ is shorthand for $G^{(k)}(1^-) \equiv \lim_{s \uparrow 1} G^{(k)}(s)$, the limiting value of the $k$th derivative of $G(s)$ at $s = 1$;
[7] As $|G_X(s)| \le E|s|^X \le 1$ for all $|s| \le 1$, the generating function $G_X(s)$ can be differentiated arbitrarily many times for all $s$ inside the disk $|s| < 1$.
[8] The Abel theorem says: If $a_k$ is non-negative for all $k$ and $G_a(s)$ is finite for $|s| < 1$, then $\lim_{s \uparrow 1} G_a(s) = \sum_k a_k$, whether this sum is finite or equals $+\infty$. By footnote 7, the Abel theorem applies to all probability generating functions.
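The two-stage description in Example 1.12 b) is easy to confirm by simulation: thinning a Poisson($\lambda$) number of eggs with probability $p$ should produce a Poisson($\lambda p$) count, with the pink and blue counts uncorrelated. A short sketch in Python, assuming NumPy; the values $\lambda = 6$, $p = 0.25$ are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, trials = 6.0, 0.25, 500_000   # arbitrary illustrative parameters

N = rng.poisson(lam, size=trials)     # number of eggs
K = rng.binomial(N, p)                # pink eggs: K | N ~ Bin(N, p)
M = N - K                             # blue eggs

print(K.mean(), K.var())              # both approx lam * p = 1.5, as for Poi(lam * p)
print(np.corrcoef(K, M)[0, 1])        # approx 0, consistent with independence of K and M
```

That the sample mean and variance of $K$ both come out near $\lambda p$ is the Poisson signature predicted by the generating-function computation.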
Exercise 1.13. A mature individual produces immature offspring according to a probability generating function $F(s)$.
a) Suppose we start with a population of $k$ immature individuals, each of which grows to maturity with probability $p$ and then reproduces, independently of other individuals. Find the probability generating function of the number of immature individuals in the next generation.
b) Find the probability generating function of the number of mature individuals in the next generation, given that there are $k$ mature individuals in the parent generation.
c) Show that the distributions in a) and b) above have the same mean, but not necessarily the same variance.
[Hint: You might prefer to first consider the case $k = 1$, and then generalise.]

The next example is very important for applications.

Example 1.14. Let $X_k$, $k \ge 1$, be i.i.d.r.v.'s with the common distribution
\[
P(X_k = 1) = p, \qquad P(X_k = -1) = q = 1 - p.
\]
Define the simple random walk $S_n$, $n \ge 0$, via $S_0 = 0$ and, for $n \ge 1$, $S_n = X_1 + X_2 + \dots + X_n$. Let $T$ be the first time this random walk hits $1$, i.e., $T \stackrel{\mathrm{def}}{=} \inf\{n \ge 1 : S_n = 1\}$. Find the generating function of $T$, i.e., $G_T(s) \equiv E[s^T]$.

Solution. Write $p_k = P(T = k)$, so that $G_T(s) \equiv E[s^T] = \sum_{k \ge 0} s^k p_k$. Conditioning on the outcome of the first step, and applying the partition theorem for expectations, we get
\[
G_T(s) \equiv E[s^T] = E[s^T \mid X_1 = 1]\, p + E[s^T \mid X_1 = -1]\, q = ps + qs\, E[s^{T_2}],
\]
where $T_2$ is the moment of first visit to state $1$ starting from $S_0 = -1$. By partitioning on the moment $T_1$ of the first visit to $0$, decompose
\[
P(T_2 = m) = \sum_{k=1}^{m-1} P(T_1 = k,\, T_2 = m).
\]
Of course, on the event $\{S_0 = -1\}$,
\[
P(T_2 = m \mid S_0 = -1) \equiv P(S_1 < 1, \dots, S_{m-1} < 1,\, S_m = 1 \mid S_0 = -1),
\]
\[
P(T_1 = k \mid S_0 = -1) \equiv P(S_1 < 0, \dots, S_{k-1} < 0,\, S_k = 0 \mid S_0 = -1),
\]
where by translation invariance the last probability is just $P(T = k \mid S_0 = 0) = p_k$. We also observe that
\[
P(T_2 = m \mid T_1 = k) = P(\text{first hit } 1 \text{ from } 0 \text{ after } m - k \text{ steps}) \equiv p_{m-k}.
\]
The partition theorem now implies
\[
P(T_2 = m) = \sum_{k=1}^{m-1} P(T_2 = m \mid T_1 = k)\, P(T_1 = k) = \sum_{k=1}^{m-1} p_{m-k}\, p_k,
\]
i.e., $G_{T_2}(s) = \bigl(G_T(s)\bigr)^2$. We deduce that $G_T(s)$ solves the quadratic equation $\varphi = ps + qs\varphi^2$, so that[9]
\[
G_T(s) = \frac{1 - \sqrt{1 - 4pqs^2}}{2qs} = \frac{2ps}{1 + \sqrt{1 - 4pqs^2}}.
\]
We finally deduce that
\[
P(T < \infty) \equiv G_T(1) = \frac{1 - |p - q|}{2q} = \begin{cases} 1, & p \ge q, \\ p/q, & p < q. \end{cases}
\]
In particular, $E(T) = \infty$ for $p < q$ (because $P(T = \infty) = (q - p)/q > 0$). For $p \ge q$ we obtain
\[
E(T) \equiv G'_T(1^-) = \begin{cases} \dfrac{1}{p - q}, & p > q, \\ \infty, & p = q, \end{cases}
\]
i.e., $E(T) < \infty$ if $p > q$ and $E(T) = \infty$ otherwise. Notice that at criticality ($p = q = 1/2$), the variable $T$ is finite with probability $1$, but has infinite expectation.

Example 1.15. In a sequence of independent Bernoulli experiments with success probability $p \in (0, 1)$, let $D$ be the first moment of two consecutive successful outcomes. Find the generating function of $D$.

Solution. Every sequence contributing to $d_n = P(D = n)$ contains at least two successes. The probability of having exactly two successes is $q^{n-2} p^2$ (of course, $n \ge 2$). Otherwise, let $k$, $1 \le k < n - 2$, be the position of the first success in the sequence; it necessarily is followed by a failure, and the remaining $n - k - 1$ results produce any possible sequence corresponding to the event $\{D = n - k - 1\}$. By independence, we deduce the following renewal relation:
\[
d_n = q^{n-2} p^2 + \sum_{k=1}^{n-3} q^{k-1} p q \, d_{n-k-1},
\]
and a standard method now gives
\[
G_D(s) = \frac{p^2 s^2}{1 - qs} + \frac{pqs^2}{1 - qs}\, G_D(s), \qquad \text{or} \qquad G_D(s) = \frac{p^2 s^2}{1 - qs - pqs^2}.
\]
A straightforward computation gives $G'_D(1) = \frac{1 + p}{p^2}$, so that on average it takes $42$ tosses of a standard symmetric die until the first two consecutive sixes appear.

Exercise 1.16. Generalize the argument above to the first moment of $m$ consecutive successes.

Exercise 1.17. In the setup of Example 1.15, show that $d_2 = p^2$, $d_3 = qp^2$, and, conditioning on the value of the first outcome, that
\[
d_n = q\, d_{n-1} + pq\, d_{n-2}, \qquad n > 3.
\]
Use these relations to again derive the generating function $G_D(s)$.

[9] by recalling the fact that $G_T(s) \to 0$ as $s \to 0$;
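The answer $G'_D(1) = (1 + p)/p^2$ in Example 1.15 invites a quick empirical check. Here is a simulation sketch in plain Python (the seed and trial count are arbitrary); with $p = 1/6$ the sample mean should come out close to $42$:

```python
import random

random.seed(2)
p, trials = 1 / 6, 200_000    # p = 1/6 models waiting for two consecutive sixes

def first_double_success(p):
    """Number of trials until two consecutive successes first occur."""
    n, prev = 0, False
    while True:
        n += 1
        cur = random.random() < p
        if prev and cur:
            return n
        prev = cur

mean = sum(first_double_success(p) for _ in range(trials)) / trials
print(mean)                   # approx (1 + p) / p**2 = 42
```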
Exercise 1.18. Use the method of Exercise 1.17 to derive an alternative solution to Exercise 1.16. Compare the obtained expectation to that from Example 1.15.

Exercise 1.19. A biased coin showing heads with probability $p \in (0, 1)$ is flipped repeatedly. Let $C_w$ be the first moment when the word $w$ appears in the observed sequence of results. Find the generating function of $C_w$ and the expectation $E[C_w]$ for each of the following words: HH, HT, TH and TT.

Probability generating functions can also be used to check convergence of distributions; this is justified by the following general continuity result:

Theorem 1.20. Let, for every fixed $n$, the sequence $a_{0,n}, a_{1,n}, \dots$ be a probability distribution, i.e., $a_{k,n} \ge 0$ and $\sum_{k \ge 0} a_{k,n} = 1$, and let $G_n(s)$ be the corresponding generating function, $G_n(s) = \sum_{k \ge 0} a_{k,n} s^k$. In order that for every fixed $k$
\[
\lim_{n \to \infty} a_{k,n} = a_k \tag{1.7}
\]
it is necessary and sufficient that for every fixed $s \in [0, 1)$,
\[
\lim_{n \to \infty} G_n(s) = G(s),
\]
where $G(s) = \sum_{k \ge 0} a_k s^k$ is the generating function of the limiting sequence $(a_k)$.

Remark. The convergence in (1.7) is known as convergence in distribution!

Example 1.21. If $X_n \sim \mathrm{Bin}(n, p)$ with $p = p_n$ satisfying $n p_n \to \lambda$ as $n \to \infty$, then for every fixed $s \in [0, 1)$
\[
G_{X_n}(s) = \bigl(1 + p_n(s - 1)\bigr)^n \to \exp\{\lambda(s - 1)\},
\]
so that the distribution of $X_n$ converges to that of $X \sim \mathrm{Poi}(\lambda)$.

Exercise 1.22. For $n \ge 1$ let $X_k^{(n)}$, $k = 1, \dots, n$, be independent Bernoulli random variables with
\[
P\bigl(X_k^{(n)} = 1\bigr) = 1 - P\bigl(X_k^{(n)} = 0\bigr) = p_k^{(n)}.
\]
Assume that, as $n \to \infty$,
\[
\delta^{(n)} \stackrel{\mathrm{def}}{=} \max_{1 \le k \le n} p_k^{(n)} \to 0 \qquad \text{and} \qquad E\Bigl[\sum_{k=1}^n X_k^{(n)}\Bigr] \to \lambda \in (0, \infty).
\]
Using generating functions or otherwise, show that the distribution of $\sum_{k=1}^n X_k^{(n)}$ converges to that of a Poisson($\lambda$) random variable. This result is known as the law of rare events.
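Theorem 1.20 and Example 1.21 can be visualised with a few lines of code. The sketch below (plain Python; the choices $\lambda = 3$ and $n = 500$ are arbitrary) tabulates the $\mathrm{Bin}(n, \lambda/n)$ probabilities next to their Poisson($\lambda$) limits:

```python
from math import comb, exp, factorial

lam, n = 3.0, 500            # arbitrary illustrative choices
p = lam / n

for k in range(8):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)    # P(Bin(n, p) = k)
    poisson = exp(-lam) * lam**k / factorial(k)     # P(Poi(lam) = k)
    print(k, round(binom, 6), round(poisson, 6))
# the two columns already agree to about three decimal places for n = 500
```

Increasing $n$ (with $n p_n = \lambda$ fixed) makes the agreement sharper, which is exactly the convergence in (1.7).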