Modeling Rare Events


Chapter 4, Lecture 15
Yiren Ding, Shanghai Qibao Dwight High School
April 24, 2016

Outline
1. The Poisson Process: three properties; a stronger property
2. Exponential Distribution: exponential CDF; memoryless property
3. Clustering Effect: exponential density; clustering effect explained
4. Simulating the Poisson Process: uniform distribution; Poisson process algorithm
5. Merging and Splitting of Poisson Processes: Mozart, Requiem, Lacrimosa

The Poisson Process: Three Properties

Definition 1 (The Poisson Process). Think of the occurrences of a rare event as customers arriving at a facility. Suppose the number of potential customers is unlimited and the customers act independently of one another. The arrival process is a Poisson process if:
(A) the customers arrive one at a time;
(B) the numbers of arrivals during non-overlapping time intervals are independent of one another;
(C) the probability mass function of the number of arrivals in any time interval (s, s + t) depends only on the length t of the interval and not on its position s on the time axis.

Examples of Poisson processes include:
- the emission of particles from a radioactive source
- the occurrence of serious earthquakes, fires, and power outages
- the arrival of urgent calls to an emergency center

The Poisson Process: Stronger Property

Theorem 1 (Stronger Version of Property C). In a Poisson process, the number of arrivals during any given time interval has a Poisson distribution whose expected value is proportional to the duration of the interval, i.e.,

    P(k arrivals during a given time interval of duration t) = e^(-αt) (αt)^k / k!,  for k = 0, 1, ...   (1)

where α is the arrival intensity, the expected number of arrivals during a time interval of unit length.

Proof. First we show that the number of arrivals during any time interval of unit length has a Poisson distribution Po(α). To do so, partition the interval of length 1 into n subintervals of equal length Δt = 1/n, where n is sufficiently large.

Theorem 1, proof (continued). By properties B and C, the numbers of arrivals in the n subintervals are i.i.d.; denote by p the probability that precisely one customer arrives in any given subinterval. For large n, the probability of two or more arrivals in a single subinterval is negligible, so each subinterval can be viewed as an independent Bernoulli trial with success probability p, and for fixed n the total number of arrivals follows a binomial distribution B(n, p). As n → ∞ we have p → 0 while np = α stays fixed, so the number of arrivals during an interval of unit length has a Poisson distribution with expected value np = α. Therefore p = α/n = α Δt: the probability that precisely one customer arrives in a small time interval is proportional to the duration of the interval, with proportionality factor α.

Theorem 1, proof (continued). We can repeat the same argument for an arbitrary time interval of duration t, now with n = t/Δt subintervals and p = α Δt. It follows by a similar argument that the number of arrivals during an interval of length t has a Poisson distribution with expected value

    λ = np = (t/Δt) · α Δt = αt.

Thus we may replace condition C with this stronger version. Finally, note that condition B is critical for the Poisson process and can only be satisfied when the population of potential customers is very large. This explains why a Poisson process can describe the emission of particles by a radioactive source: it contains a huge number of atoms, which act independently and each decay with a very small probability.
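The binomial-to-Poisson limit in the proof is easy to probe numerically: for large n, the pmf of B(n, αt/n) is nearly indistinguishable from the Poisson pmf with mean αt. A small sketch (the choices α = 1, t = 2, n = 10,000 are illustrative, not from the slides):

```python
from math import comb, exp, factorial

alpha, t = 1.0, 2.0   # arrival intensity and interval length
lam = alpha * t       # Poisson mean for the whole interval
n = 10_000            # number of subintervals
p = lam / n           # per-subinterval arrival probability

# Compare the binomial pmf B(n, p) with the Poisson pmf Po(lam) at k = 0..10
max_diff = 0.0
for k in range(11):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    max_diff = max(max_diff, abs(binom - poisson))

print(max_diff)  # the two pmfs agree pointwise to within roughly 1e-4
```

Increasing n shrinks the discrepancy further, which is exactly the limit the proof takes.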

Exponential Distribution: Exponential CDF

Relationship with the exponential distribution. It would be a shame to study the Poisson distribution without knowing its continuous counterpart: the exponential distribution. In a Poisson arrival process, the number of arrivals during a given time interval is a discrete random variable, but the time between two successive arrivals can take on any positive value and is thus a so-called continuous random variable. Denote by T the waiting time between two successive arrivals in a Poisson process with arrival intensity α. Then T has the cumulative distribution function

    P(T ≤ t) = 1 - e^(-αt),  for t ≥ 0.   (2)

It should not surprise you that E(T) = 1/α. (Why?)

[Figure: effect of α on the exponential CDF]

Exponential Distribution: Exponential CDF

Intuitive proof of (2). We show that P(T > t) = e^(-αt), where α is the arrival intensity. Partition the positive t-axis by 0, Δt, 2Δt, 3Δt, ..., where Δt > 0 is fixed. (It helps to think of Δt as one second.) Denote by X_i the number of arrivals in the i-th subinterval. By Theorem 1, each X_i follows the Poisson distribution Po(αΔt). For any t > 0, let n = t/Δt; since nΔt → t as Δt → 0,

    P(T > t) = lim_{Δt→0} P(T > nΔt) = lim_{Δt→0} [P(X_i = 0)]^n = lim_{Δt→0} (e^(-αΔt))^(t/Δt) = e^(-αt).

It follows that the CDF of the exponential distribution is P(T ≤ t) = 1 - e^(-αt).

Exponential Distribution: Memoryless Property

Theorem 2 (Memoryless Property). The exponential distribution has an amazing memoryless property: for every fixed point in time, the waiting period from that point until the first arrival after it has the same exponential distribution as the inter-arrival times, regardless of how long it has been since the last customer arrived before that point. Symbolically,

    P(T > s + t | T > s) = P(T > t)  for all s, t > 0.   (3)

Proof. From (2) we have P(T > t) = e^(-αt). Using the definition of conditional probability,

    P(T > s + t | T > s) = P(T > s + t) / P(T > s) = e^(-α(s+t)) / e^(-αs) = e^(-αt) = P(T > t).
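The memoryless property can also be checked by simulation: among simulated exponential waiting times that exceed s, the fraction that also exceed s + t should match the unconditional P(T > t). A quick sketch (α = 1 and s = t = 1 are arbitrary choices):

```python
import random
from math import exp

random.seed(42)
alpha, s, t = 1.0, 1.0, 1.0

# Draw many exponential waiting times with rate alpha
samples = [random.expovariate(alpha) for _ in range(200_000)]
survived_s = [x for x in samples if x > s]

# Conditional frequency P(T > s + t | T > s) vs. unconditional e^(-alpha*t)
cond = sum(x > s + t for x in survived_s) / len(survived_s)
print(cond, exp(-alpha * t))  # the two values should be close
```

The agreement is no accident: among continuous distributions, only the exponential has this property.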

Exponential Distribution: Memoryless Property

Example 1. Out in front of Central Station, multiple-passenger taxicabs wait until either they have acquired four passengers or a period of ten minutes has passed since the first passenger stepped into the cab. Passengers arrive according to a Poisson process with an average of one passenger every three minutes.
(a) You are the first passenger to get into a cab. What is the probability that you will have to wait ten minutes before the cab gets underway?
(b) You were the first passenger to get into a cab and have been waiting there for five minutes. In the meantime, two other passengers have entered the cab. What is the probability that you will have to wait another five minutes before the cab gets underway?

Solution. First note that the arrival intensity of this Poisson process is α = 1/3, the expected number of arrivals per minute.

Example 1, solution (continued). For part (a), we use Theorem 1 to conclude that the number of arrivals during ten minutes is Poisson distributed with expected value 10α = 10/3. Hence,

    P(you must wait ten minutes)
      = P(0, 1, or 2 passengers arrive within the next ten minutes)
      = e^(-10/3) + e^(-10/3) (10/3)/1! + e^(-10/3) (10/3)^2/2!
      ≈ 0.3528.

For part (b), we use the memoryless property of the Poisson process. The waiting period before the arrival of the next passenger is exponentially distributed with expected value 1/α = 3 minutes, regardless of how long you have waited! Thus the desired probability is e^(-5α) = e^(-5/3) ≈ 0.1889.
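Both answers in Example 1 reduce to one-line computations:

```python
from math import exp, factorial

lam = 10 / 3  # expected number of arrivals in ten minutes

# Part (a): at most 2 further passengers arrive, so the ten-minute timer expires
p_a = sum(exp(-lam) * lam**k / factorial(k) for k in range(3))

# Part (b): memoryless property -- the next arrival takes more than 5 minutes
p_b = exp(-5 / 3)

print(round(p_a, 4), round(p_b, 4))  # prints 0.3528 0.1889
```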

Clustering Effect: Clustering Effect of Arrival Times

[Figure: simulated arrival times in the interval (0, 45) for a Poisson process with arrival intensity α = 1]

The figure above shows simulated arrival times in the time interval (0, 45) for a Poisson process with arrival intensity α = 1. The arrival times reveal a strong tendency to cluster. To see why, we need to learn more about the exact shape of the exponential distribution.

Clustering Effect: Exponential Density

Let T denote the inter-arrival time. We know that T is a continuous random variable with CDF P(T ≤ t) = 1 - e^(-αt), α > 0. Unlike a discrete random variable, which has a probability mass function, a continuous random variable has no mass at any point, because P(T = t) = 0 for every t in R+. (We will prove this later.) Assume that we can measure the inter-arrival time only with an accuracy of one second; this gives a new discrete random variable T(1) with probability mass function

    P(T(1) = t) = P(t - 1 < T ≤ t),  for t = 1, 2, ...

It makes perfect sense to define it this way because, due to rounding, a measurement of 5 seconds may refer to any value in the interval (4, 5].

Clustering Effect: Exponential Density

Since the event {T ≤ t} is the disjoint union of the events {t - 1 < T ≤ t} and {T ≤ t - 1}, and using the linearization e^x ≈ 1 + x for small x,

    P(T(1) = t) = P(t - 1 < T ≤ t)
                = P(T ≤ t) - P(T ≤ t - 1)
                = (1 - e^(-αt)) - (1 - e^(-α(t-1)))
                = (e^α - 1) e^(-αt) ≈ α e^(-αt).

Let T(n) denote the inter-arrival time measured to within 1/n of a second. This discrete random variable has probability mass function

    P(T(n) = t) = (e^(α/n) - 1) e^(-αt) ≈ (1/n) α e^(-αt).

A curious reader will already see that α e^(-αt) is common to both expressions. This is the exponential density function f(t).

Clustering Effect: Exponential Density

From physics, you know that Density = Mass / Volume. In probability, there is a similar relation:

    Probability Density = Probability Mass / Interval Length.

Indeed, we see that

    lim_{n→∞} P(T(n) = t) / (1/n) = α e^(-αt) = f(t).

A formal definition of probability density will be presented later.
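This limit is easy to probe numerically: the probability mass that T places in a shrinking interval just below t, divided by the interval length, approaches α e^(-αt). A sketch (α = 0.5 and t = 2 are arbitrary choices):

```python
from math import exp

alpha, t = 0.5, 2.0
f = alpha * exp(-alpha * t)  # exponential density at t

for n in (10, 1000, 100_000):
    # mass in the interval (t - 1/n, t], divided by its length 1/n
    mass = exp(-alpha * (t - 1 / n)) - exp(-alpha * t)
    print(n, mass * n)  # approaches f(t) = alpha * e^(-alpha*t)
```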

Clustering Effect: Exponential Density

[Sequence of figures: probability density visualized; histogram bars of P(T(n) = t) for finer and finer measurement grids, whose areas approach the area under the density curve f(t) = α e^(-αt)]

Clustering Effect: Exponential Density

Note that the heights of the bars in the previous graphs do not represent probabilities; their areas do! This begins to resemble a Riemann sum. Indeed, we have

    P(T ≤ t) = ∫_0^t f(x) dx = ∫_0^t α e^(-αx) dx = [-e^(-αx)]_{x=0}^{t} = 1 - e^(-αt).

Hence it should be intuitively clear that

    P(s < T ≤ t) = ∫_s^t f(x) dx.

The value of f(t) at a single point is not a probability!

[Figure: effect of α on the exponential density]

Clustering Effect: Clustering Effect Explained

Denote by F(t) = 1 - e^(-αt) the cumulative distribution function. Have you noticed the remarkable fact that the probability density function is simply the derivative of the CDF? That is,

    F'(t) = d/dt (1 - e^(-αt)) = α e^(-αt) = f(t).

By the definition of the derivative (I hope you haven't forgotten),

    f(t) = lim_{Δt→0} [F(t + Δt) - F(t)] / Δt.

This implies that

    P(t < T ≤ t + Δt) ≈ f(t) Δt,  for Δt small.

Clustering Effect: Clustering Effect Explained

From its shape, we know that f(t) is largest at t = 0. Hence, for a fixed small value Δt, the probability P(t < T ≤ t + Δt) of T being close to t is largest at t = 0. This explains the clustering effect: relatively short inter-arrival times occur more frequently than long ones!

Suppose that on each day there is a 1% chance of someone being murdered in Shanghai. The occurrences of murders can then be modeled by a Poisson process with arrival intensity 3.65 murders per year. Over a period of, say, twenty years, the probability that some 12-month window contains nine or more murders is approximately 60%. This contrasts sharply with the probability of nine or more murders in one fixed 12-month period, which is only about 1.3%.
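The probability of nine or more murders in one fixed 12-month period is a direct Poisson tail computation:

```python
from math import exp, factorial

lam = 3.65  # expected murders per year

# P(X >= 9) for X ~ Po(3.65): nine or more murders in a fixed year
p_tail = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(9))
print(round(p_tail, 4))  # about 0.013, a rare event for any one fixed year
```

That such a rare fixed-window event becomes likely over many sliding 12-month windows is another face of the clustering effect.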

Clustering Effect: Clustering Effect Explained

Example 2. In a given city, traffic accidents occur according to a Poisson process with an average of α = 10 accidents per week. In a certain week, seven accidents have occurred. What is the probability that exactly one accident occurred on each day of that week? Can you explain beforehand why this probability must be small?

Solution. Define random variables X_1, X_2, ..., X_7, where X_i denotes the number of accidents occurring on day i, and let X = X_1 + ... + X_7. The probability we are looking for is

    P(X_1 = 1, ..., X_7 = 1 | X = 7).   (4)

By properties B and C of the Poisson process and Theorem 1, the X_i are i.i.d. Poisson random variables with expected value α/7, and X is Poisson distributed with expected value α.

Example 2, solution (continued). By the definition of conditional probability, (4) equals

    P(X_1 = 1) ··· P(X_7 = 1) / P(X = 7) = (e^(-α/7) (α/7))^7 / (e^(-α) α^7 / 7!) = 7! / 7^7 ≈ 0.0061.

This is indeed a small probability. The tendency of Poisson arrivals to cluster explains why it is so small. Incidentally, this is the same as the probability of independently drawing seven random numbers from (0, 1) and having exactly one of them in each of the seven intervals (0, 1/7), (1/7, 2/7), ..., (6/7, 1). Thus there is a close relationship between the Poisson arrival process and the uniform distribution, which can be used to simulate Poisson processes in higher dimensions (e.g., stars in the sky).
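The closed form 7!/7^7 is tiny to verify, and the stated connection to the uniform distribution can be checked by simulation (the trial count below is an arbitrary choice):

```python
import random
from math import factorial

random.seed(7)

# Exact conditional probability from Example 2
p_exact = factorial(7) / 7**7

# Equivalent statement: drop 7 uniform points in (0, 1) and count how often
# each of the 7 equal subintervals receives exactly one point
trials, hits = 100_000, 0
for _ in range(trials):
    bins = [int(random.random() * 7) for _ in range(7)]
    if sorted(bins) == [0, 1, 2, 3, 4, 5, 6]:
        hits += 1

print(p_exact, hits / trials)  # the two values should agree closely
```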

Clustering Effect: Clustering Effect Explained

[Figures: randomly generated points in 2D and in 3D]

Simulating Poisson Process: Uniform Distribution

A discrete random variable X is said to have a uniform distribution if all values in the range of X have identical probability. For example, rolling a fair die yields a uniform distribution: P(i) = 1/6 for i = 1, ..., 6. Likewise, we can extend this to the continuous case: a continuous random variable X is said to have a uniform density over the interval (a, b) if its probability density function is given by

    f(x) = 1/(b - a)  for a < x < b.   (5)

By Axiom 2, P(X in (a, b)) = 1: a total mass of 1 over an interval of length b - a. By uniformity, the density must be 1/(b - a).

[Figure: the uniform density on (a, b)]

Simulating Poisson Process: Poisson Process Algorithm

Example 3. Let the random variable T be given by T = -(1/α) ln(U), where U is a random number in (0, 1) and α a positive number. Show that T is exponentially distributed, i.e., P(T ≤ t) = 1 - e^(-αt).

Solution. First note that T is a positive random variable. For any t > 0,

    P(T ≤ t) = P(-(1/α) ln(U) ≤ t) = P(ln(U) ≥ -αt) = P(U ≥ e^(-αt)) = 1 - P(U < e^(-αt)).

Since U is uniform on (0, 1), P(U < e^(-αt)) = ∫_0^{e^(-αt)} 1 dx = e^(-αt), so

    P(T ≤ t) = 1 - e^(-αt).

Hence T is exponentially distributed with expected value 1/α.

Simulating Poisson Process: Poisson Process Algorithm

Simulating a Poisson Process. Example 3 gives a simple algorithm to simulate Poisson arrivals:
(1) Generate a random number u between 0 and 1.
(2) Take -(1/α) ln(u) as the next inter-arrival time.

The following Python code generates 20 arrivals with α = 1.

import matplotlib.pyplot as plt
import numpy as np
from math import log

n, alpha, x, X = 20, 1, 0, []
for i in range(n):
    u = np.random.random()
    x = x - (1/alpha)*log(u)   # add an exponential inter-arrival time
    X.append(x)                # record the arrival epoch

plt.plot(X, len(X) * [0], "x")
plt.show()

[Figure: simulation results, 20 simulated Poisson arrival times]

Simulating Poisson Process: Poisson Process Algorithm

Power of Simulation. The previous algorithm is extremely useful for approximating solutions to problems that are not easy to solve analytically. Consider the problem we encountered before: assume that on each day there is a 1% probability of someone being murdered. What is the probability that somewhere within a time frame of ten years there will be one 12-month period containing nine or more murders? What about twenty years?

It is reasonable to model the occurrences of murders by a Poisson process with arrival intensity α = p/Δt = 3.65 murders per year. This problem is hopeless to attack analytically, but using Python we find that the former probability is around 36% and the latter around 60%. Sometimes in real life, having an approximation is good enough!

Simulating Poisson Process: Poisson Process Algorithm

Python Code.

import numpy as np
from math import log

numtrials, count = 10000, 0

def PoissonProcess(alpha, time):
    """Return the arrival epochs of a Poisson process on [0, time]."""
    x, X = 0, []
    while x <= time:
        u = np.random.random()
        x = x - (1/alpha)*log(u)
        X.append(x)
    return X[:-1]  # drop the final epoch, which overshoots `time`

for num in range(numtrials):
    dupe = False
    X = PoissonProcess(3.65, 10)
    # Nine consecutive arrivals fit in a 12-month window exactly when
    # the first and the ninth are at most 1 year apart.
    for i in range(len(X) - 8):
        if X[i+8] - X[i] <= 1:
            dupe = True
    if dupe:
        count += 1

prob = count / numtrials
print(prob)

Merging and Splitting of Poisson Processes

Consider a call center that receives emergency phone calls from two different companies A and B. Calls from company A and calls from company B form independent Poisson processes with arrival intensities α_A and α_B, respectively.

Claim 1: Merging these two arrival processes gives a Poisson process with arrival intensity α_A + α_B.

Now suppose that the occurrence of earthquakes of any magnitude is a Poisson process with arrival intensity α, and that each earthquake, independently, is of high magnitude with probability p and of low magnitude with probability 1 - p.

Claim 2: The occurrences of high-magnitude and low-magnitude earthquakes form two independent Poisson processes with arrival intensities αp and α(1 - p), respectively!
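Claim 1 is easy to sanity-check by simulation: superposing two independent streams of exponential inter-arrival times yields a stream whose empirical arrival rate is α_A + α_B. A sketch (the intensities 2 and 3 and the horizon are arbitrary choices):

```python
import random

random.seed(1)

def arrivals(alpha, horizon):
    """Arrival epochs of a Poisson process with intensity alpha on [0, horizon]."""
    t, out = 0.0, []
    while True:
        t += random.expovariate(alpha)  # exponential inter-arrival time
        if t > horizon:
            return out
        out.append(t)

horizon = 2000.0
merged = sorted(arrivals(2.0, horizon) + arrivals(3.0, horizon))

# Empirical rate of the merged stream should be close to 2 + 3 = 5
rate = len(merged) / horizon
print(rate)
```

A fuller check would also confirm that the merged inter-arrival times are exponential with rate 5; the rate estimate alone is just the first consistency test.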

Merging and Splitting of Poisson Processes

Example 4. A piece of radioactive material emits alpha-particles according to a Poisson process with an intensity of 0.84 particles per second. A counter detects each emitted particle, independently, with probability 0.95. In a 10-second period the number of detected particles is 12. What is the probability that more than 15 particles were emitted in that period?

Solution. By Claim 2, the emission of undetected particles is an independent Poisson process with arrival intensity 0.84 × 0.05 = 0.042. Denote by X the number of undetected particles in the 10-second period. Then X ~ Po(λ) with λ = 10 × 0.042 = 0.42. More than 15 particles emitted, with 12 detected, means more than 3 undetected, so the desired probability is

    P(X > 3) = 1 - Σ_{k=0}^{3} e^(-0.42) (0.42)^k / k! ≈ 0.0009.
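The tail probability in Example 4 is a short computation (assuming, as in the solution, that the undetected stream has mean λ = 0.42 over the 10 seconds):

```python
from math import exp, factorial

lam = 0.42  # expected number of undetected particles in 10 seconds

# P(X > 3) for X ~ Po(0.42): more than 3 undetected particles
p = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(4))
print(round(p, 5))  # roughly 0.00093
```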

Merging and Splitting of Poisson Processes: Mozart, Requiem, Lacrimosa

Example 5 (Variant of the Coupon Collector Problem). In the casino game of craps, two dice are rolled repeatedly. What is the probability that each of the totals 2, 3, 4, 5, 6, 8, 9, 10, 11, and 12 will be rolled before a seven is rolled?

Beautiful solution. A beautiful trick is to imagine that the rolls of the two dice occur according to a Poisson process with arrival intensity α = 1. By Claim 2, this Poisson process splits into 11 independent Poisson processes with arrival intensities αp_j = p_j, where p_j is the probability of rolling a total of j with two dice. Fix t, and suppose that a total of 7 occurs for the first time at time t. By the independence of the split processes, the probability that each of the totals 2, 3, 4, 5, 6, 8, 9, 10, 11, and 12 shows up in (0, t), given that the first 7 occurs at time t, is

    f(t) = (1 - e^(-p_2 t)) ··· (1 - e^(-p_6 t)) (1 - e^(-p_8 t)) ··· (1 - e^(-p_12 t)).

Merging and Splitting of Poisson Processes: Mozart, Requiem, Lacrimosa

The probability that a 7 occurs for the first time in the very small interval (t, t + Δt) is approximately p_7 e^(-p_7 t) Δt, since p_7 e^(-p_7 t) is the exponential density of the waiting time until the first 7. By the Law of Conditional Probability, the desired probability is a Riemann sum that converges to an improper integral:

    lim_{Δt→0} Σ_{i=0}^{∞} f(iΔt) p_7 e^(-p_7 iΔt) Δt = ∫_0^∞ f(t) p_7 e^(-p_7 t) dt.

(An improper integral ∫_0^∞ f(x) dx is defined as lim_{a→∞} ∫_0^a f(x) dx.)

Plugging in the values of p_j and using numerical integration gives approximately 0.005258 for the desired probability. Uncontrollable tears of joy fall loudly on the keyboard.
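The integral can be evaluated with a basic quadrature rule; the cutoff 300 and the step size below are ad hoc choices that make the truncation and discretization errors negligible:

```python
from math import exp

# p_j = probability of rolling a total of j with two fair dice
p = {j: (6 - abs(j - 7)) / 36 for j in range(2, 13)}

def integrand(t):
    # f(t) * p_7 * e^(-p_7 t): every other total seen in (0, t),
    # and the first 7 occurring at time t
    prod = 1.0
    for j in range(2, 13):
        if j != 7:
            prod *= 1 - exp(-p[j] * t)
    return prod * p[7] * exp(-p[7] * t)

# Trapezoidal rule on [0, 300]; e^(-p_7 * 300) = e^(-50) is negligible
h, upper = 0.01, 300.0
n = int(upper / h)
total = 0.5 * (integrand(0.0) + integrand(upper))
total += sum(integrand(i * h) for i in range(1, n))
value = total * h
print(round(value, 6))  # about 0.005258
```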

Merging and Splitting of Poisson Processes: Mozart, Requiem, Lacrimosa

In some casinos the payout for this craps bet is $175 for each $1 bet. This corresponds to a house edge of

    1 - 175 × 0.005258 ≈ 7.99%.

To cry harder, consider another game: you win if any total is rolled twice before a seven, and a win pays 2 for 1. It is left as a homework exercise to show that the win probability is

    1 - ∫_0^∞ (e^(-p_2 t) + p_2 t e^(-p_2 t)) ··· (e^(-p_6 t) + p_6 t e^(-p_6 t)) (e^(-p_8 t) + p_8 t e^(-p_8 t)) ··· (e^(-p_12 t) + p_12 t e^(-p_12 t)) p_7 e^(-p_7 t) dt ≈ 0.471.

This corresponds to a house edge of about 5.79%.


More information

Continuous Probability Spaces

Continuous Probability Spaces Continuous Probability Spaces Ω is not countable. Outcomes can be any real number or part of an interval of R, e.g. heights, weights and lifetimes. Can not assign probabilities to each outcome and add

More information

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution Pengyuan (Penelope) Wang June 15, 2011 Review Discussed Uniform Distribution and Normal Distribution Normal Approximation

More information

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS 1.1. The Rutherford-Chadwick-Ellis Experiment. About 90 years ago Ernest Rutherford and his collaborators at the Cavendish Laboratory in Cambridge conducted

More information

Common Discrete Distributions

Common Discrete Distributions Common Discrete Distributions Statistics 104 Autumn 2004 Taken from Statistics 110 Lecture Notes Copyright c 2004 by Mark E. Irwin Common Discrete Distributions There are a wide range of popular discrete

More information

Recap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks

Recap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks Recap Probability, stochastic processes, Markov chains ELEC-C7210 Modeling and analysis of communication networks 1 Recap: Probability theory important distributions Discrete distributions Geometric distribution

More information

Random variables. DS GA 1002 Probability and Statistics for Data Science.

Random variables. DS GA 1002 Probability and Statistics for Data Science. Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities

More information

Continuous Random Variables

Continuous Random Variables 1 Continuous Random Variables Example 1 Roll a fair die. Denote by X the random variable taking the value shown by the die, X {1, 2, 3, 4, 5, 6}. Obviously the probability mass function is given by (since

More information

MAS1302 Computational Probability and Statistics

MAS1302 Computational Probability and Statistics MAS1302 Computational Probability and Statistics April 23, 2008 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great

More information

Name: Firas Rassoul-Agha

Name: Firas Rassoul-Agha Midterm 1 - Math 5010 - Spring 016 Name: Firas Rassoul-Agha Solve the following 4 problems. You have to clearly explain your solution. The answer carries no points. Only the work does. CALCULATORS ARE

More information

ST 371 (V): Families of Discrete Distributions

ST 371 (V): Families of Discrete Distributions ST 371 (V): Families of Discrete Distributions Certain experiments and associated random variables can be grouped into families, where all random variables in the family share a certain structure and a

More information

DISCRETE VARIABLE PROBLEMS ONLY

DISCRETE VARIABLE PROBLEMS ONLY DISCRETE VARIABLE PROBLEMS ONLY. A biased die with four faces is used in a game. A player pays 0 counters to roll the die. The table below shows the possible scores on the die, the probability of each

More information

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).

More information

Chapter 4 Continuous Random Variables and Probability Distributions

Chapter 4 Continuous Random Variables and Probability Distributions Chapter 4 Continuous Random Variables and Probability Distributions Part 3: The Exponential Distribution and the Poisson process Section 4.8 The Exponential Distribution 1 / 21 Exponential Distribution

More information

Part 3: Parametric Models

Part 3: Parametric Models Part 3: Parametric Models Matthew Sperrin and Juhyun Park August 19, 2008 1 Introduction There are three main objectives to this section: 1. To introduce the concepts of probability and random variables.

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Senior Math Circles November 19, 2008 Probability II

Senior Math Circles November 19, 2008 Probability II University of Waterloo Faculty of Mathematics Centre for Education in Mathematics and Computing Senior Math Circles November 9, 2008 Probability II Probability Counting There are many situations where

More information

Random Walk on a Graph

Random Walk on a Graph IOR 67: Stochastic Models I Second Midterm xam, hapters 3 & 4, November 2, 200 SOLUTIONS Justify your answers; show your work.. Random Walk on a raph (25 points) Random Walk on a raph 2 5 F B 3 3 2 Figure

More information

ACCESS TO SCIENCE, ENGINEERING AND AGRICULTURE: MATHEMATICS 2 MATH00040 SEMESTER / Probability

ACCESS TO SCIENCE, ENGINEERING AND AGRICULTURE: MATHEMATICS 2 MATH00040 SEMESTER / Probability ACCESS TO SCIENCE, ENGINEERING AND AGRICULTURE: MATHEMATICS 2 MATH00040 SEMESTER 2 2017/2018 DR. ANTHONY BROWN 5.1. Introduction to Probability. 5. Probability You are probably familiar with the elementary

More information

Introduction. Probability and distributions

Introduction. Probability and distributions Introduction. Probability and distributions Joe Felsenstein Genome 560, Spring 2011 Introduction. Probability and distributions p.1/18 Probabilities We will assume you know that Probabilities of mutually

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

Introduction and Overview STAT 421, SP Course Instructor

Introduction and Overview STAT 421, SP Course Instructor Introduction and Overview STAT 421, SP 212 Prof. Prem K. Goel Mon, Wed, Fri 3:3PM 4:48PM Postle Hall 118 Course Instructor Prof. Goel, Prem E mail: goel.1@osu.edu Office: CH 24C (Cockins Hall) Phone: 614

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Expected Values, Exponential and Gamma Distributions

Expected Values, Exponential and Gamma Distributions Expected Values, Exponential and Gamma Distributions Sections 5.2-5.4 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 14-3339 Cathy Poliak,

More information

Introductory Statistics

Introductory Statistics Introductory Statistics OpenStax Rice University 6100 Main Street MS-375 Houston, Texas 77005 To learn more about OpenStax, visit http://openstaxcollege.org. Individual print copies and bulk orders can

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

Discrete Random Variables

Discrete Random Variables Chapter 5 Discrete Random Variables Suppose that an experiment and a sample space are given. A random variable is a real-valued function of the outcome of the experiment. In other words, the random variable

More information

Probability Theory and Applications

Probability Theory and Applications Probability Theory and Applications Videos of the topics covered in this manual are available at the following links: Lesson 4 Probability I http://faculty.citadel.edu/silver/ba205/online course/lesson

More information

Name of the Student:

Name of the Student: SUBJECT NAME : Probability & Queueing Theory SUBJECT CODE : MA 6453 MATERIAL NAME : Part A questions REGULATION : R2013 UPDATED ON : November 2017 (Upto N/D 2017 QP) (Scan the above QR code for the direct

More information

2.1 Elementary probability; random sampling

2.1 Elementary probability; random sampling Chapter 2 Probability Theory Chapter 2 outlines the probability theory necessary to understand this text. It is meant as a refresher for students who need review and as a reference for concepts and theorems

More information

Lecture 4a: Continuous-Time Markov Chain Models

Lecture 4a: Continuous-Time Markov Chain Models Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time

More information

Expected Values, Exponential and Gamma Distributions

Expected Values, Exponential and Gamma Distributions Expected Values, Exponential and Gamma Distributions Sections 5.2 & 5.4 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 13-3339 Cathy

More information

Chapter 8: An Introduction to Probability and Statistics

Chapter 8: An Introduction to Probability and Statistics Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including

More information

2. AXIOMATIC PROBABILITY

2. AXIOMATIC PROBABILITY IA Probability Lent Term 2. AXIOMATIC PROBABILITY 2. The axioms The formulation for classical probability in which all outcomes or points in the sample space are equally likely is too restrictive to develop

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

CMPSCI 240: Reasoning Under Uncertainty

CMPSCI 240: Reasoning Under Uncertainty CMPSCI 240: Reasoning Under Uncertainty Lecture 5 Prof. Hanna Wallach wallach@cs.umass.edu February 7, 2012 Reminders Pick up a copy of B&T Check the course website: http://www.cs.umass.edu/ ~wallach/courses/s12/cmpsci240/

More information

The Geometric Distribution

The Geometric Distribution MATH 382 The Geometric Distribution Dr. Neal, WKU Suppose we have a fixed probability p of having a success on any single attempt, where p > 0. We continue to make independent attempts until we succeed.

More information

6 Event-based Independence and Conditional Probability. Sneak peek: Figure 3: Conditional Probability Example: Sneak Peek

6 Event-based Independence and Conditional Probability. Sneak peek: Figure 3: Conditional Probability Example: Sneak Peek 6 Event-based Independence and Conditional Probability Example Example Roll 6.1. a fair Roll dicea dice... Sneak peek: Figure 3: Conditional Probability Example: Sneak Peek Example 6.2 (Slides). Diagnostic

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 Introduction One of the key properties of coin flips is independence: if you flip a fair coin ten times and get ten

More information

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section

More information

Marquette University Executive MBA Program Statistics Review Class Notes Summer 2018

Marquette University Executive MBA Program Statistics Review Class Notes Summer 2018 Marquette University Executive MBA Program Statistics Review Class Notes Summer 2018 Chapter One: Data and Statistics Statistics A collection of procedures and principles

More information

3.4. The Binomial Probability Distribution

3.4. The Binomial Probability Distribution 3.4. The Binomial Probability Distribution Objectives. Binomial experiment. Binomial random variable. Using binomial tables. Mean and variance of binomial distribution. 3.4.1. Four Conditions that determined

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

CS 237 Fall 2018, Homework 07 Solution

CS 237 Fall 2018, Homework 07 Solution CS 237 Fall 2018, Homework 07 Solution Due date: Thursday November 1st at 11:59 pm (10% off if up to 24 hours late) via Gradescope General Instructions Please complete this notebook by filling in solutions

More information

Sampling Random Variables

Sampling Random Variables Sampling Random Variables Introduction Sampling a random variable X means generating a domain value x X in such a way that the probability of generating x is in accordance with p(x) (respectively, f(x)),

More information

POISSON RANDOM VARIABLES

POISSON RANDOM VARIABLES POISSON RANDOM VARIABLES Suppose a random phenomenon occurs with a mean rate of occurrences or happenings per unit of time or length or area or volume, etc. Note: >. Eamples: 1. Cars passing through an

More information

(a) Find the mean and standard deviation of X. (5)

(a) Find the mean and standard deviation of X. (5) 1. A student arrives at a school X minutes after 08:00, where X may be assumed to be normally distributed. On a particular day it is observed that 40 % of the students arrive before 08:30 and 90 % arrive

More information

Topic 3 - Discrete distributions

Topic 3 - Discrete distributions Topic 3 - Discrete distributions Basics of discrete distributions Mean and variance of a discrete distribution Binomial distribution Poisson distribution and process 1 A random variable is a function which

More information

Discrete Structures for Computer Science

Discrete Structures for Computer Science Discrete Structures for Computer Science William Garrison bill@cs.pitt.edu 6311 Sennott Square Lecture #24: Probability Theory Based on materials developed by Dr. Adam Lee Not all events are equally likely

More information

Conditional Probability

Conditional Probability Conditional Probability Idea have performed a chance experiment but don t know the outcome (ω), but have some partial information (event A) about ω. Question: given this partial information what s the

More information

Exponential, Gamma and Normal Distribuions

Exponential, Gamma and Normal Distribuions Exponential, Gamma and Normal Distribuions Sections 5.4, 5.5 & 6.5 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 9-3339 Cathy Poliak,

More information

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) =

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) = 1. If X has density f(x) = { cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. 2. Let X have density f(x) = { xe x, 0 < x < 0, otherwise. (a) Find P (X > 2). (b) Find

More information

Part 3: Parametric Models

Part 3: Parametric Models Part 3: Parametric Models Matthew Sperrin and Juhyun Park April 3, 2009 1 Introduction Is the coin fair or not? In part one of the course we introduced the idea of separating sampling variation from a

More information

1 The Basic Counting Principles

1 The Basic Counting Principles 1 The Basic Counting Principles The Multiplication Rule If an operation consists of k steps and the first step can be performed in n 1 ways, the second step can be performed in n ways [regardless of how

More information

Chapter 3. Discrete Random Variables and Their Probability Distributions

Chapter 3. Discrete Random Variables and Their Probability Distributions Chapter 3. Discrete Random Variables and Their Probability Distributions 2.11 Definition of random variable 3.1 Definition of a discrete random variable 3.2 Probability distribution of a discrete random

More information

The Exponential Distribution

The Exponential Distribution Connexions module: m46969 1 The Exponential Distribution OpenStax College This work is produced by The Connexions Project and licensed under the Creative Commons Attribution License 3.0 The exponential

More information

1. I had a computer generate the following 19 numbers between 0-1. Were these numbers randomly selected?

1. I had a computer generate the following 19 numbers between 0-1. Were these numbers randomly selected? Activity #10: Continuous Distributions Uniform, Exponential, Normal) 1. I had a computer generate the following 19 numbers between 0-1. Were these numbers randomly selected? 0.12374454, 0.19609266, 0.44248450,

More information

STATISTICAL THINKING IN PYTHON I. Probability density functions

STATISTICAL THINKING IN PYTHON I. Probability density functions STATISTICAL THINKING IN PYTHON I Probability density functions Continuous variables Quantities that can take any value, not just discrete values Michelson's speed of light experiment measured speed of

More information

Discrete Mathematics and Probability Theory Fall 2013 Vazirani Note 12. Random Variables: Distribution and Expectation

Discrete Mathematics and Probability Theory Fall 2013 Vazirani Note 12. Random Variables: Distribution and Expectation CS 70 Discrete Mathematics and Probability Theory Fall 203 Vazirani Note 2 Random Variables: Distribution and Expectation We will now return once again to the question of how many heads in a typical sequence

More information

Math 1313 Experiments, Events and Sample Spaces

Math 1313 Experiments, Events and Sample Spaces Math 1313 Experiments, Events and Sample Spaces At the end of this recording, you should be able to define and use the basic terminology used in defining experiments. Terminology The next main topic in

More information

Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem Spring 2014

Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem Spring 2014 Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem 18.5 Spring 214.5.4.3.2.1-4 -3-2 -1 1 2 3 4 January 1, 217 1 / 31 Expected value Expected value: measure of

More information

Example 1. The sample space of an experiment where we flip a pair of coins is denoted by:

Example 1. The sample space of an experiment where we flip a pair of coins is denoted by: Chapter 8 Probability 8. Preliminaries Definition (Sample Space). A Sample Space, Ω, is the set of all possible outcomes of an experiment. Such a sample space is considered discrete if Ω has finite cardinality.

More information

Tutorial 1 : Probabilities

Tutorial 1 : Probabilities Lund University ETSN01 Advanced Telecommunication Tutorial 1 : Probabilities Author: Antonio Franco Emma Fitzgerald Tutor: Farnaz Moradi January 11, 2016 Contents I Before you start 3 II Exercises 3 1

More information

Probability Theory and Simulation Methods. April 6th, Lecture 19: Special distributions

Probability Theory and Simulation Methods. April 6th, Lecture 19: Special distributions April 6th, 2018 Lecture 19: Special distributions Week 1 Chapter 1: Axioms of probability Week 2 Chapter 3: Conditional probability and independence Week 4 Chapters 4, 6: Random variables Week 9 Chapter

More information