MAS1302 Computational Probability and Statistics


1 MAS1302 Computational Probability and Statistics April 23, 2008

2 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great deal: the Uniform distribution on the interval (0,1). This is denoted by U(0,1), and is the random variable with probability density function (p.d.f.): f(x) = 1 for 0 < x < 1, and f(x) = 0 otherwise.

3 3.1 The U(0,1) Distribution [Figure: the U(0,1) p.d.f., f(x) = 1 on the interval (0,1).] Up to now, we have used U(0,1) random variables to simulate from many discrete distributions, and taken the U(0,1) random variables for granted. In this chapter on continuous random behaviour, we will start by thinking about how we simulate from this simplest of continuous distributions.

4 Example Many Casio calculators provide sequences of random numbers using the function SHIFT RAN# = : A typical sequence of these might be: 0.909, 0.304, 0.966, 0.837, 0.151,... These are, in fact, U(0,1) random numbers recorded to 3 decimal places. So how does the calculator generate them?

5 Solution (Slide 1 of 2) The key is in the accuracy to which the numbers are generated! Generating U(0,1) random variables precisely would be very difficult! (Impossible?!?) However we can get three decimal places by generating a random integer x from the set {0,1,2,...,999}, with all outcomes equally likely, and then put u = x/1000. Then u is a simulation from a U(0,1) random variable recorded to three decimals.

6 Solution (Slide 2 of 2) The simulation of x is easy, e.g. using a congruential generator with m = 1000 and maximum period, as we saw in Section 2.4. Some calculators give the Uniform random numbers to 10 decimal places rather than 3. This could be achieved with a congruential generator with m = 10^10, so simulating x from the set of integers: {0, 1, 2, ..., 10^10 − 1}. Setting u = x/10^10 would give a pseudo-random number from the U(0,1) distribution, recorded to 10 decimals.
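The course does its simulation in R; as a sketch of the mechanism just described, here is a small congruential generator in Python with m = 1000. The multiplier a = 21 and increment c = 13 are my own illustrative full-period choices, not values from Section 2.4.

```python
def lcg_uniforms(seed, n, m=1000, a=21, c=13):
    """Return n pseudo-random U(0,1) values, each recorded to 3 decimals.

    Each step simulates an integer x from {0, 1, ..., m-1} via the
    congruential recursion x <- (a*x + c) mod m, then sets u = x/m.
    """
    x = seed
    us = []
    for _ in range(n):
        x = (a * x + c) % m  # congruential step
        us.append(x / m)     # u = x/1000, a U(0,1) value to 3 decimals
    return us
```

With these constants the generator attains the maximum period m, so 1000 successive values visit every multiple of 0.001 exactly once before repeating.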

7 As we will see, R will take care of all the details for us, and will supply many U(0,1) random variables to a good degree of accuracy very quickly. We have already seen how continuous U(0,1) random variables can be used to generate discrete probability distributions (see Chapter 2). Soon we will see how U(0,1) random variables can be used to generate other continuous probability distributions. First, however, we will see how simulation from U(0,1) random variables can sometimes be useful in its own right.

8 3.2 Monte Carlo Methods Introduction The term Monte Carlo is used to describe any simulation study which involves random numbers. The name is a reference to the famous Monte Carlo Casino in Monaco, where repetition of random events is the order of the day! What is a simulation study? For our purposes, a simulation study is any investigation of the properties of a system using random numbers. We have already seen several simulation studies in Section 2.11, when we verified some rules for probability distributions, and checked how well some data fitted a probability model.

9 Often, simulation studies involve estimating the probability of an event (e.g. in an earlier Example, the probabilities of each outcome were estimated). In general, suppose we do an experiment which has the event A as one possible outcome. We would like to estimate the probability of A, denoted by P(A). Then by repeatedly simulating the experiment, it is simple to estimate P(A), the probability of the event A, using P_F(A), where: P_F(A) = (No. of times A occurs) / (No. of times the experiment is simulated). Here P_F(A) is so called because it is a Frequency estimate of P(A).

10 Why does this work? Well, suppose we denote the number of times we simulate the experiment by n. Then P_F(A) has the following important property: as n → ∞, P_F(A) → P(A). This means that the larger our number of simulations, the more accurate our estimate of P(A) will be.
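This convergence is easy to see empirically. A minimal Python sketch (the course uses R; the event and seed here are my own choices), estimating P(A) for an event whose probability we know exactly, namely A = {U < 0.3} with P(A) = 0.3:

```python
import random

def frequency_estimate(n, seed=0):
    """Frequency estimate P_F(A) of P(U < 0.3) for U ~ U(0,1)."""
    rng = random.Random(seed)
    r = sum(1 for _ in range(n) if rng.random() < 0.3)  # no. of times A occurs
    return r / n  # P_F(A) = r / (no. of simulations)
```

Increasing n moves the estimate closer to the true value 0.3, exactly as the property above promises.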

11 Example Suppose U_1 and U_2 are two independent U(0,1) random variables. Let Y = U_1 + U_2. Then Y is a random variable on the interval (0,2), i.e. Y is a random variable which can take any value in the range (0,2). Question 1: What is P(Y < 1.5), the probability that Y is less than 1.5? Question 2: (harder) What is the probability density function (p.d.f.) of Y? Question 3: What is the cumulative distribution function (c.d.f.) of Y?

12 Monte Carlo solution to Question 1 (Slide 1 of 2) Let's formulate Question 1 as a Monte Carlo problem: Here A is the event {Y < 1.5}. We want P(A). Suppose we simulate n observations from the random variable Y, and we observe the number of occasions, r, that A happens, i.e. r = #{Y < 1.5}. Then our Monte Carlo estimate of P(A) is: P_F(A) = r/n.

13 Monte Carlo solution to Question 1 (Slide 2 of 2) This was carried out in R with n = 1000, and we obtained r = 861. Hence we estimate: P_F(A) = 861/1000 = 0.861, where P_F(A) is our estimate of P(A) = P(Y < 1.5).
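A Python sketch of the same computation (the lectures used R; with a different random seed the count r will differ slightly from 861):

```python
import random

def estimate_p(n=1000, seed=1):
    """Monte Carlo estimate of P(Y < 1.5), where Y = U1 + U2."""
    rng = random.Random(seed)
    r = sum(1 for _ in range(n)
            if rng.random() + rng.random() < 1.5)  # A = {Y < 1.5} occurred
    return r / n  # P_F(A) = r/n
```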

14 In this case, it is possible to answer Question 1 mathematically ('analytically'). This solution will be exactly right, so it will enable us to see how well we did with the Monte Carlo solution (see Problems Class).

15 Analytical solution to Question 1 (Slide 1 of 5) Recall Y = U_1 + U_2, where U_1 and U_2 are independent U(0,1) random variables. The distribution of Y is on the interval (0,2), but it is not uniform, and it is not obvious what it is. However, we can answer Question 1 by thinking geometrically!

16 Analytical solution to Question 1 (Slide 2 of 5) In the plot of U_2 against U_1 below, the shaded region is completely identified with the event A = {Y < 1.5}. [Figure: the unit square in the (U_1, U_2) plane, with the region U_1 + U_2 < 1.5 shaded.] Given that the total area of the square is 1, the area of the shaded region is the probability we want!

17 Analytical solution to Question 1 (Slide 3 of 5) It is clear from the diagram that the answer is: P(A) = 7/8 = 0.875. N.B. This answer is exactly right, so we can look back and see how well our Monte Carlo estimate P_F(A) compared with the true value of P(A). We got 0.861 as our estimate of 0.875. Not bad!

18 Analytical solution to Question 1 (Slide 4 of 5) Note that the area of the shaded region was easy to calculate using common sense arguments, but it could also have been expressed as an integral. Consider the line which forms the upper boundary in the sketch. It has the equation u_2 = f(u_1) = 1 for 0 < u_1 ≤ 1/2; 1.5 − u_1 for 1/2 < u_1 < 1; and 0 otherwise.

19 Analytical solution to Question 1 (Slide 5 of 5) But note that the probability we want is actually the area under the curve, given by the integral ∫_0^1 f(u_1) du_1 = 7/8 = 0.875. Check you can do the integration here!
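Writing the integration out in full, splitting the range at u_1 = 1/2 where the boundary function changes from the constant 1 to the line 1.5 − u_1:

```latex
\int_0^1 f(u_1)\,du_1
  = \int_0^{1/2} 1\,du_1 + \int_{1/2}^{1} (1.5 - u_1)\,du_1
  = \frac{1}{2} + \left[\,1.5\,u_1 - \tfrac{1}{2}u_1^2\,\right]_{1/2}^{1}
  = \frac{1}{2} + \left(1 - \frac{5}{8}\right)
  = \frac{7}{8}.
```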

20 An analytical solution to Question 2 in Example is beyond the scope of this course, but if you fancy a challenge, it is not *too* hard to guess the answer! When you have the answer to Question 2 (soon), you *should* be able to answer Question 3! The solution to Question 1 leads us on to think more formally about using Monte Carlo methods as a method for evaluating integrals...

21 3.3 Approximation of integrals Suppose we have a complicated function f defined on the interval (0,1), and also suppose that f(x) ∈ (0,1). [Figure: a complicated curve f(x) lying inside the unit square.] We would like to evaluate ∫_0^1 f(x) dx, but the function may be too complicated to integrate.

22 We can find an approximate answer by noting that the integral is equal to the area under the curve, and using Monte Carlo methods. We design an experiment which would work in general, even if the function were defined on a general range (a, b), and if f(x) ∈ (0, c), for any positive value c.

23 We generate a random data point from a simulation grid. Let A = {the data point lies below the curve}. Then P(A) = [area under curve] / [area of simulation grid], so that ∫_a^b f(x) dx = [area under curve] = P(A) × [area of simulation grid].

24 Example Consider the complicated function we saw earlier, defined on (0,1), and with f(x) ∈ (0,1): [Figure: the same complicated curve f(x) inside the unit square.] Estimate ∫_0^1 f(x) dx using Monte Carlo methods.

25 Solution (Slide 1 of 3) 1. We simulate n data points from the simulation grid; here this is the unit square (0,1) × (0,1). [Figure: simulated points scattered over the unit square, with the curve f(x) overlaid.]

26 Solution (Slide 2 of 3) 2. Each of the coordinates x and y is generated using a U(0,1) random variable. Note that A = {data point (x, y) lies below the curve} = {y < f(x)}. 3. If r points lie below the curve, then the proportion r/n = P_F(A) is an estimate of P(A), and thus ∫_0^1 f(x) dx = P(A) × [area of simulation grid] = P(A) × 1 = P(A) ≈ P_F(A) = r/n.

27 Solution (Slide 3 of 3) 4. In our case, from the figure above we estimate: ∫_0^1 f(x) dx ≈ r/n = 14/20 = 0.7.
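The hit-or-miss scheme in steps 1–3 can be sketched in Python (the course uses R). The integrand below, f(x) = x², is only a stand-in for the complicated function in the example, chosen because its integral, 1/3, is known exactly:

```python
import random

def hit_or_miss(f, n, seed=2):
    """Estimate the integral of f over (0,1), where f(x) lies in (0,1)."""
    rng = random.Random(seed)
    r = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()  # random point in the unit square
        if y < f(x):                       # event A: point below the curve
            r += 1
    return r / n  # estimate = P_F(A) * [area of grid] = (r/n) * 1
```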

28 Example It is well known that there is no analytical expression for values of the Normal cumulative distribution function. For example, suppose we are interested in the standard Normal distribution with mean 0 and variance 1. Consider a random variable Z with this distribution, i.e. Z ∼ N(0,1). Now suppose we want to evaluate P(0 ≤ Z ≤ 1).

29 Example (continued) It is very easy to write this probability down as an integral: ∫_0^1 φ(x) dx = (1/√(2π)) ∫_0^1 exp(−x²/2) dx. Here φ(x) is the standard Normal probability density function evaluated at x. The problem is that it is impossible to evaluate this integral exactly, because indefinite integrals of the form ∫ exp(ax²) dx cannot be expressed in terms of elementary functions. It is easy, however, to formulate this as a Monte Carlo problem...

30 Monte Carlo Solution (Slide 1 of 3) The sketch of the Normal probability density function (p.d.f.) below illustrates the required probability P(0 ≤ Z ≤ 1) as the shaded area. Once again, the simulation grid is the unit square. [Figure: the standard Normal p.d.f., with the area under the curve between x = 0 and x = 1 shaded.]

31 Monte Carlo Solution (Slide 2 of 3) As before, we simulate X and Y as independent U(0,1) random variables, so that we obtain an observation (x,y) each time. The event A is that the point (x,y) lies in the shaded area. This is true if y < φ(x), so here A = {y < φ(x)}. If we simulate n pairs (x,y), and set r = #{y < φ(x)}, then our estimate of the required integral is r/n.

32 Monte Carlo Solution (Slide 3 of 3) This was carried out in R with n = 1000, and we obtained r = 346. Hence we estimate: (1/√(2π)) ∫_0^1 exp(−x²/2) dx ≈ 346/1000 = 0.346. The true value of this integral is known to be 0.341 (to three decimals, obtained using numerical methods). So once again we see that Monte Carlo simulation has given us a fairly accurate estimate!
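A Python sketch of this calculation (R was used in lectures; the seed here is an arbitrary choice, so r will differ slightly from 346):

```python
import math
import random

def phi(x):
    """Standard Normal p.d.f."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def estimate_normal_area(n, seed=3):
    """Hit-or-miss estimate of P(0 <= Z <= 1) over the unit square."""
    rng = random.Random(seed)
    r = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()  # random point in the unit square
        if y < phi(x):                     # event A = {y < phi(x)}
            r += 1
    return r / n
```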

33 Remarks 1 The accuracy of the estimate is dependent on the value of n. Remember P_F(A) → P(A) as n → ∞. 2 We will use this method in Assignment 2 to estimate the circular constant π. 3 Later we will use Monte Carlo methods with non-uniform random variables. 4 Notice that in Question 1 of the first Example the outcome of interest was a probability, whereas in the two later Examples it was an integral.

34 3.4 Non Uniform Continuous Random Variables In order to simulate many real life systems we will need to generate non uniform continuous random variables. As with the various discrete random variables we generated in Section 2, we will use the continuous Uniform U(0,1) distribution as our starting point. We will use a method called the Inversion Method, also known as the Table Look-Up Method, to transform a U(0,1) random variable to any other continuous distribution we desire. This works as long as we can evaluate the cumulative distribution function (c.d.f.) of the continuous distribution in question.

35 Theorem: The Inversion Method Suppose we wish to simulate a continuous random variable X with cumulative distribution function F(x). (Remember the notation: F(x) = P(X ≤ x).) Now let U be a U(0,1) random variable. Then X = F^{-1}(U) has the required distribution. Proof: P(X ≤ x) = P(F^{-1}(U) ≤ x) = P(U ≤ F(x)) = F(x).

36 Example Suppose X is a random variable with an exponential distribution with mean 2. How do we simulate from X?

37 Solution (Slide 1 of 3) The Exponential distribution with mean 1/λ has c.d.f. F(x) = 1 − e^(−λx). We simulate a value u from a U(0,1) distribution, and set F(x) = 1 − e^(−λx) = u. Now rearranging this to make x a function of u, we get: x = −(1/λ) log(1 − u) = F^{-1}(u). (1) So given any simulated value u, we use Equation (1) with λ = 1/2 (giving mean equal to 2, as required) to obtain x, our simulation from X, where X ∼ Exp(1/2) as required.

38 Solution (Slide 2 of 3) Putting λ = 1/2, Equation (1) becomes: x = −2 log(1 − u). So, for example, if we generated u = 0.785, then we would obtain: x = F^{-1}(u) = −2 log(1 − 0.785) = −2 log(0.215) = 3.074 (to 3 d.p.). It is easiest to see why this works by visualising the process...

39 Solution (Slide 3 of 3) The inversion method for the Exp(0.5) distribution with u = 0.785: [Figure: the c.d.f. F(x) = 1 − e^(−x/2) plotted against x, with the horizontal line u = F(x) = 0.785 traced across to the curve and down to x = F^{-1}(u) on the x-axis.]

40 Summary So, in order to simulate a value x from a random variable X, we need to take the following two steps: 1 Simulate a random number u from the U(0,1) distribution. 2 Set x = F^{-1}(u), where F is the c.d.f. of X, and F^{-1} is therefore the inverse c.d.f. This is all O.K., but what happens if we can't write down the function F^{-1}?
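The two-step summary translates directly into code. A Python sketch for the Exponential case worked above (the course uses R, where the built-in rexp would normally be used instead):

```python
import math
import random

def simulate_exponential(lam, rng):
    """Inversion method for X ~ Exp(lam), with F(x) = 1 - exp(-lam*x)."""
    u = rng.random()               # step 1: u from U(0,1)
    return -math.log(1 - u) / lam  # step 2: x = F^{-1}(u)
```

For example, with lam = 0.5 and u = 0.785 this reproduces x = −2 log(0.215) ≈ 3.074 from the worked solution, and the sample mean of many such draws is close to 1/lam = 2.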

41 E.g. Suppose Y is a random variable with a Normal N(µ, σ²) distribution. How do we simulate from Y? This time we have a problem, because we can't write down the c.d.f. Φ(x) for the Normal distribution, so we can't invert it explicitly to obtain the inverse c.d.f. Φ^{-1}(u). Fortunately tables for Φ^{-1}(u) exist (e.g. see Neave's tables), but these are only for the standard Normal distribution, i.e. for the random variable Z, where Z ∼ N(0,1).

42 So starting with a simulated observation u from the Uniform U(0,1) distribution, we can look up our simulated observation z from our standard Normal random variable Z ∼ N(0,1). However, remember we wanted an observation y from Y ∼ N(µ, σ²). This is now easy to generate if we remember that (Y − µ)/σ ∼ N(0,1). We put (Y − µ)/σ = Z, so that Y = µ + σZ ∼ N(µ, σ²). (2) By plugging our simulated observation z from Z into Equation (2) we immediately obtain our simulated value y from Y.
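In code, the table look-up for Φ^{-1} can be replaced by a numerical inverse c.d.f. A Python sketch using the standard library's NormalDist (in R one would use qnorm instead of tables):

```python
from statistics import NormalDist

def simulate_normal(mu, sigma, u):
    """Convert u from U(0,1) into an observation from N(mu, sigma^2)."""
    z = NormalDist().inv_cdf(u)  # z = Phi^{-1}(u), a standard Normal value
    return mu + sigma * z        # y = mu + sigma*z, by Equation (2)
```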

43 Example A. Convert the simulated value u = 0.904 from U(0,1) to an observation from Y ∼ N(100, 15²). B. Convert the simulated value u = 0.414 from U(0,1) to an observation from Y ∼ N(36, 2²).

44 Solution (Slide 1 of 2) A. We take the following steps: 1 Look up z = Φ^{-1}(u) to obtain z = Φ^{-1}(0.904) ≈ 1.305. 2 Calculate y = µ + σz = 100 + 15 × 1.305 ≈ 119.6.

45 Solution (Slide 2 of 2) B. We take the following steps: 1 Look up z = Φ^{-1}(u) to obtain z = Φ^{-1}(0.414) ≈ −0.217. 2 Calculate y = µ + σz = 36 + 2 × (−0.217) ≈ 35.57. N.B. Check you can repeat this process! Use the following observations: a) u = 0.687; b) u =

46 3.5 Exponential Variables and Poisson Processes In the Example above we saw how to simulate pseudo-random numbers from the Exponential Distribution with parameter λ (mean 1/λ). The Exponential Distribution is important in statistics, often as a model for the inter-arrival times between events. For example, consider a queue for a cashier at a bank. Now suppose that the chance of a customer arriving at any instant in time is constant over time. Then the time interval between arrivals is a random variable having the Exponential Distribution.

47 Now suppose we start counting at some arbitrary time zero. Then the time to the first event, E_1, and the time intervals between the (i−1)th and the ith events, E_i, are all independent Exponential random variables with some parameter λ: E_i ∼ Exp(λ), independently, for i = 1, 2, 3, ... The p.d.f. of the E_i is f(x) = λe^(−λx), and the mean inter-arrival time is 1/λ.

48 Now consider the number of events N occurring in a unit time interval. If the chance of an arrival at any instant in time is constant over time, then N ∼ Poisson(λ). That is to say, the distribution of the number of events in a unit time interval is Poisson, with parameter (mean) equal to λ. Moreover, the distribution of the number of events in a time interval of length t_1 is Poisson(λt_1). Finally, the number of events in a non-overlapping time interval of length t_2 is independently distributed Poisson(λt_2).

49 Summary So, for any process where events occur such that the chance of an occurrence remains constant through time, and the average number of events per unit time is λ, the following properties automatically hold: 1 The inter-event times are independently Exponentially distributed with parameter λ and mean 1/λ. 2 The number of events in a time interval of length t is Poisson(λt), independently of the number of events in any other non-overlapping time interval. Any process like this is called a Poisson Process in time. The fact that events occur at a constant rate means it is called a homogeneous Poisson Process.
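Property 1 gives an easy way to simulate a Poisson process: add up Exp(λ) inter-event times until a time horizon t is passed. A Python sketch (the function name and structure are my own; the course would do this in R):

```python
import math
import random

def poisson_count(lam, t, rng):
    """Number of events of a rate-lam Poisson process in (0, t]."""
    count, time = 0, 0.0
    while True:
        gap = -math.log(1 - rng.random()) / lam  # Exp(lam) inter-event time
        time += gap
        if time > t:
            return count  # the next event falls beyond the horizon
        count += 1
```

By property 2, the counts produced this way are Poisson(λt), so their average over many runs approaches λt.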

50 Examples of Poisson Processes in Time (Slide 1 of 1) Goals scored in a football match. (May not be homogeneous, e.g. the chance of a goal in any given instant may increase towards the end of the match.) Crashes of a computer system. Emission of particles from a radioactive source. Instances of a river level being higher than a crucial safety level x. Instances of asteroid impact explosions greater than y megatonnes. (In the study of the history of the earth, such occurrences are linked to mass extinctions of species.)

51 3.6 Simulation of Random Systems Now that we are able to simulate from any continuous random variable for which we can write down the c.d.f., we are in a position to simulate much more interesting/realistic examples than we have thus far. We will see that systems with quite complex behaviour can be simulated by building them from components which are simple random variables.

52 Example Queueing system Suppose we want to model a queue for a specialist service at a bank. A really simple model might be as follows (units are minutes): Inter-arrival times are distributed Exp(0.05), i.e. Exponential with mean 20. Service times are distributed N(25, 5²). Consider the following questions: 1 What is the distribution of the number of customers waiting at time t = 120 minutes? 2 In a 6-hour day, what is the mean customer waiting time? 3 In a 6-hour day, what is the distribution of the maximum customer waiting time?

53 Example Queueing system (continued) In principle we could find mathematical solutions to these problems. In practice this would be extremely difficult. However, very good approximate answers can be obtained by repeated simulations of six hours of queue behaviour. This is easily achieved by simulating a sequence of arrival times from the Exp(0.05) distribution and a sequence of service times from the N(25, 5²) distribution, and then constructing a time line on which we monitor the size of the queue. Let's see how this works by simulating the first hour of queue time...
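A hedged Python sketch of one simulated day (the lectures build a time line in R; here, as a modelling shortcut rather than the notes' construction, each customer's waiting time is tracked directly via the single-server Lindley recursion W_next = max(0, W + S − T), where S is a service time and T the gap to the next arrival; service times are floored at zero since N(25, 5²) can in principle go negative):

```python
import math
import random

def simulate_day(day_length=360.0, lam=0.05, mu=25.0, sigma=5.0, seed=6):
    """Waiting times for every customer arriving in one 6-hour day."""
    rng = random.Random(seed)

    def inter_arrival():
        return -math.log(1 - rng.random()) / lam  # Exp(0.05) gap, mean 20

    waits, wait = [], 0.0
    arrival = inter_arrival()
    while arrival <= day_length:
        waits.append(wait)                        # this customer's wait
        service = max(0.0, rng.gauss(mu, sigma))  # N(25, 5^2), floored at 0
        gap = inter_arrival()                     # time to next arrival
        arrival += gap
        wait = max(0.0, wait + service - gap)     # Lindley recursion
    return waits
```

Repeating this with many seeds and recording the mean and maximum of the returned waits gives Monte Carlo answers to Questions 2 and 3.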

54 Example Birth Death Process Consider a population of organisms (e.g. bacteria) evolving through time: The initial population size is denoted by N(0). Any individual splits ('birth') with rate λ. Any individual dies with rate µ. Once again the rules are very simple to set up, but the evolution of the population size N(t) over time (t > 0) can be complex. It is easy to study this process by simulation, but very hard to do the maths.

55 Example Birth Death Process (continued) It is not immediately obvious how we can simulate from such a process, because any individual could split or die at any time. However, one solution is as follows: It can be shown that the time from one event at time t (birth or death) to the next event is a random variable E with distribution E ∼ Exp[(λ + µ)N(t)]. The next event is either a birth or a death, with probabilities λ/(λ + µ) and µ/(λ + µ) respectively.

56 Solution (Slide 1 of 1) This gives rise to a two-step simulation process from one event to the next: 1 Simulate the time to the next event from an Exp[(λ + µ)N(t)] random variable. 2 Simulate the type of event from a Bernoulli random variable with success probability λ/(λ + µ) corresponding to a birth; otherwise the event is a death. (We will be simulating birth-death processes in Assignment 2!)
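The two steps above, as a Python sketch (function and variable names are my own; Assignment 2 will use R):

```python
import math
import random

def birth_death(n0, lam, mu, t_end, rng):
    """Population size N(t_end) of a birth-death process started at n0."""
    t, n = 0.0, n0
    while n > 0:  # at n = 0 no individuals remain to split or die
        rate = (lam + mu) * n
        t += -math.log(1 - rng.random()) / rate  # step 1: Exp(rate) gap
        if t > t_end:
            break                                # no more events before t_end
        if rng.random() < lam / (lam + mu):      # step 2: Bernoulli trial
            n += 1                               # success: a birth
        else:
            n -= 1                               # failure: a death
    return n
```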

57 Example Progress of an epidemic Here we consider the evolution of the population of infected individuals as a birth-death process. The 'birth' process corresponds to new individuals becoming infected. The 'death' process corresponds to infected individuals being removed from the population (by being cured, or hospitalized, or dying). In order to stop the epidemic, it is necessary to get the death rate higher than the birth rate.

58 Case study: The SARS epidemic of 2003 SARS (Severe Acute Respiratory Syndrome) is a viral disease which has had one major epidemic (Nov. 2002 – Jul. 2003). Denote the rate of infections at time t by λ_t. Denote the rate of removal from the infectious state at time t by µ_t.

59 Case study: The SARS epidemic of 2003 In the early stages (small t), we had λ_t > µ_t, and the epidemic spread. As time passed (large t), measures were taken to reduce infectivity, and hence achieve λ_t < µ_t. The epidemic was reduced and halted. Note that one reason this was achievable was that SARS is most infectious during the symptomatic period. It would be much more difficult for diseases such as influenza, which are most infectious pre-symptoms!


More information

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions Introduction to Statistical Data Analysis Lecture 3: Probability Distributions James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis

More information

Continuous random variables

Continuous random variables Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot

More information

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

2 Functions of random variables

2 Functions of random variables 2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as

More information

Continuous Probability Spaces

Continuous Probability Spaces Continuous Probability Spaces Ω is not countable. Outcomes can be any real number or part of an interval of R, e.g. heights, weights and lifetimes. Can not assign probabilities to each outcome and add

More information

2905 Queueing Theory and Simulation PART IV: SIMULATION

2905 Queueing Theory and Simulation PART IV: SIMULATION 2905 Queueing Theory and Simulation PART IV: SIMULATION 22 Random Numbers A fundamental step in a simulation study is the generation of random numbers, where a random number represents the value of a random

More information

2 Random Variable Generation

2 Random Variable Generation 2 Random Variable Generation Most Monte Carlo computations require, as a starting point, a sequence of i.i.d. random variables with given marginal distribution. We describe here some of the basic methods

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Introduction to Queuing Networks Solutions to Problem Sheet 3

Introduction to Queuing Networks Solutions to Problem Sheet 3 Introduction to Queuing Networks Solutions to Problem Sheet 3 1. (a) The state space is the whole numbers {, 1, 2,...}. The transition rates are q i,i+1 λ for all i and q i, for all i 1 since, when a bus

More information

Chapter 4. Continuous Random Variables

Chapter 4. Continuous Random Variables Chapter 4. Continuous Random Variables Review Continuous random variable: A random variable that can take any value on an interval of R. Distribution: A density function f : R R + such that 1. non-negative,

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder

More information

BINOMIAL DISTRIBUTION

BINOMIAL DISTRIBUTION BINOMIAL DISTRIBUTION The binomial distribution is a particular type of discrete pmf. It describes random variables which satisfy the following conditions: 1 You perform n identical experiments (called

More information

Part 3: Parametric Models

Part 3: Parametric Models Part 3: Parametric Models Matthew Sperrin and Juhyun Park August 19, 2008 1 Introduction There are three main objectives to this section: 1. To introduce the concepts of probability and random variables.

More information

Special distributions

Special distributions Special distributions August 22, 2017 STAT 101 Class 4 Slide 1 Outline of Topics 1 Motivation 2 Bernoulli and binomial 3 Poisson 4 Uniform 5 Exponential 6 Normal STAT 101 Class 4 Slide 2 What distributions

More information

Continuous random variables

Continuous random variables Continuous random variables CE 311S What was the difference between discrete and continuous random variables? The possible outcomes of a discrete random variable (finite or infinite) can be listed out;

More information

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models

More information

Will Landau. Feb 21, 2013

Will Landau. Feb 21, 2013 Iowa State University Feb 21, 2013 Iowa State University Feb 21, 2013 1 / 31 Outline Iowa State University Feb 21, 2013 2 / 31 random variables Two types of random variables: Discrete random variable:

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

CIVL Continuous Distributions

CIVL Continuous Distributions CIVL 3103 Continuous Distributions Learning Objectives - Continuous Distributions Define continuous distributions, and identify common distributions applicable to engineering problems. Identify the appropriate

More information

Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras

Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras (Refer Slide Time: 00:23) Probability Foundation for Electrical Engineers Prof. Krishna Jagannathan Department of Electrical Engineering Indian Institute of Technology, Madras Lecture - 22 Independent

More information

IE 336 Seat # Name (one point) < KEY > Closed book. Two pages of hand-written notes, front and back. No calculator. 60 minutes.

IE 336 Seat # Name (one point) < KEY > Closed book. Two pages of hand-written notes, front and back. No calculator. 60 minutes. Closed book. Two pages of hand-written notes, front and back. No calculator. 6 minutes. Cover page and four pages of exam. Four questions. To receive full credit, show enough work to indicate your logic.

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

PROBABILITY DISTRIBUTIONS

PROBABILITY DISTRIBUTIONS Review of PROBABILITY DISTRIBUTIONS Hideaki Shimazaki, Ph.D. http://goo.gl/visng Poisson process 1 Probability distribution Probability that a (continuous) random variable X is in (x,x+dx). ( ) P x < X

More information

Lecture 20. Poisson Processes. Text: A Course in Probability by Weiss STAT 225 Introduction to Probability Models March 26, 2014

Lecture 20. Poisson Processes. Text: A Course in Probability by Weiss STAT 225 Introduction to Probability Models March 26, 2014 Lecture 20 Text: A Course in Probability by Weiss 12.1 STAT 225 Introduction to Probability Models March 26, 2014 Whitney Huang Purdue University 20.1 Agenda 1 2 20.2 For a specified event that occurs

More information

ISyE 3044 Fall 2017 Test #1a Solutions

ISyE 3044 Fall 2017 Test #1a Solutions 1 NAME ISyE 344 Fall 217 Test #1a Solutions This test is 75 minutes. You re allowed one cheat sheet. Good luck! 1. Suppose X has p.d.f. f(x) = 4x 3, < x < 1. Find E[ 2 X 2 3]. Solution: By LOTUS, we have

More information

Math Review Sheet, Fall 2008

Math Review Sheet, Fall 2008 1 Descriptive Statistics Math 3070-5 Review Sheet, Fall 2008 First we need to know about the relationship among Population Samples Objects The distribution of the population can be given in one of the

More information

ECE 313 Probability with Engineering Applications Fall 2000

ECE 313 Probability with Engineering Applications Fall 2000 Exponential random variables Exponential random variables arise in studies of waiting times, service times, etc X is called an exponential random variable with parameter λ if its pdf is given by f(u) =

More information

The Binomial distribution. Probability theory 2. Example. The Binomial distribution

The Binomial distribution. Probability theory 2. Example. The Binomial distribution Probability theory Tron Anders Moger September th 7 The Binomial distribution Bernoulli distribution: One experiment X i with two possible outcomes, probability of success P. If the experiment is repeated

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

Basics of Stochastic Modeling: Part II

Basics of Stochastic Modeling: Part II Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

1 Review of Probability and Distributions

1 Review of Probability and Distributions Random variables. A numerically valued function X of an outcome ω from a sample space Ω X : Ω R : ω X(ω) is called a random variable (r.v.), and usually determined by an experiment. We conventionally denote

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

Line of symmetry Total area enclosed is 1

Line of symmetry Total area enclosed is 1 The Normal distribution The Normal distribution is a continuous theoretical probability distribution and, probably, the most important distribution in Statistics. Its name is justified by the fact that

More information

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009.

ECE 302 Division 2 Exam 2 Solutions, 11/4/2009. NAME: ECE 32 Division 2 Exam 2 Solutions, /4/29. You will be required to show your student ID during the exam. This is a closed-book exam. A formula sheet is provided. No calculators are allowed. Total

More information

Part 3: Parametric Models

Part 3: Parametric Models Part 3: Parametric Models Matthew Sperrin and Juhyun Park April 3, 2009 1 Introduction Is the coin fair or not? In part one of the course we introduced the idea of separating sampling variation from a

More information

Stochastic Models in Computer Science A Tutorial

Stochastic Models in Computer Science A Tutorial Stochastic Models in Computer Science A Tutorial Dr. Snehanshu Saha Department of Computer Science PESIT BSC, Bengaluru WCI 2015 - August 10 to August 13 1 Introduction 2 Random Variable 3 Introduction

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY (formerly the Examinations of the Institute of Statisticians) GRADUATE DIPLOMA, 2004

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY (formerly the Examinations of the Institute of Statisticians) GRADUATE DIPLOMA, 2004 EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY (formerly the Examinations of the Institute of Statisticians) GRADUATE DIPLOMA, 004 Statistical Theory and Methods I Time Allowed: Three Hours Candidates should

More information

M/G/1 and M/G/1/K systems

M/G/1 and M/G/1/K systems M/G/1 and M/G/1/K systems Dmitri A. Moltchanov dmitri.moltchanov@tut.fi http://www.cs.tut.fi/kurssit/elt-53606/ OUTLINE: Description of M/G/1 system; Methods of analysis; Residual life approach; Imbedded

More information

Chapter 4: Continuous Probability Distributions

Chapter 4: Continuous Probability Distributions Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random

More information

STAT 516: Basic Probability and its Applications

STAT 516: Basic Probability and its Applications Lecture 4: Random variables Prof. Michael September 15, 2015 What is a random variable? Often, it is hard and/or impossible to enumerate the entire sample space For a coin flip experiment, the sample space

More information

UCSD ECE 153 Handout #20 Prof. Young-Han Kim Thursday, April 24, Solutions to Homework Set #3 (Prepared by TA Fatemeh Arbabjolfaei)

UCSD ECE 153 Handout #20 Prof. Young-Han Kim Thursday, April 24, Solutions to Homework Set #3 (Prepared by TA Fatemeh Arbabjolfaei) UCSD ECE 53 Handout #0 Prof. Young-Han Kim Thursday, April 4, 04 Solutions to Homework Set #3 (Prepared by TA Fatemeh Arbabjolfaei). Time until the n-th arrival. Let the random variable N(t) be the number

More information

Lecture 3. Discrete Random Variables

Lecture 3. Discrete Random Variables Math 408 - Mathematical Statistics Lecture 3. Discrete Random Variables January 23, 2013 Konstantin Zuev (USC) Math 408, Lecture 3 January 23, 2013 1 / 14 Agenda Random Variable: Motivation and Definition

More information

Statistics 100A Homework 5 Solutions

Statistics 100A Homework 5 Solutions Chapter 5 Statistics 1A Homework 5 Solutions Ryan Rosario 1. Let X be a random variable with probability density function a What is the value of c? fx { c1 x 1 < x < 1 otherwise We know that for fx to

More information

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12 Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution

More information

2 Continuous Random Variables and their Distributions

2 Continuous Random Variables and their Distributions Name: Discussion-5 1 Introduction - Continuous random variables have a range in the form of Interval on the real number line. Union of non-overlapping intervals on real line. - We also know that for any

More information

Stat 100a, Introduction to Probability.

Stat 100a, Introduction to Probability. Stat 100a, Introduction to Probability. Outline for the day: 1. Geometric random variables. 2. Negative binomial random variables. 3. Moment generating functions. 4. Poisson random variables. 5. Continuous

More information

Uniform random numbers generators

Uniform random numbers generators Uniform random numbers generators Lecturer: Dmitri A. Moltchanov E-mail: moltchan@cs.tut.fi http://www.cs.tut.fi/kurssit/tlt-2707/ OUTLINE: The need for random numbers; Basic steps in generation; Uniformly

More information

Stat 426 : Homework 1.

Stat 426 : Homework 1. Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

Chapter 4. Continuous Random Variables 4.1 PDF

Chapter 4. Continuous Random Variables 4.1 PDF Chapter 4 Continuous Random Variables In this chapter we study continuous random variables. The linkage between continuous and discrete random variables is the cumulative distribution (CDF) which we will

More information

1 Review of Probability

1 Review of Probability 1 Review of Probability Random variables are denoted by X, Y, Z, etc. The cumulative distribution function (c.d.f.) of a random variable X is denoted by F (x) = P (X x), < x

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 4 Statistical Models in Simulation Purpose & Overview The world the model-builder sees is probabilistic rather than deterministic. Some statistical model

More information

STAT509: Continuous Random Variable

STAT509: Continuous Random Variable University of South Carolina September 23, 2014 Continuous Random Variable A continuous random variable is a random variable with an interval (either finite or infinite) of real numbers for its range.

More information

Math Stochastic Processes & Simulation. Davar Khoshnevisan University of Utah

Math Stochastic Processes & Simulation. Davar Khoshnevisan University of Utah Math 5040 1 Stochastic Processes & Simulation Davar Khoshnevisan University of Utah Module 1 Generation of Discrete Random Variables Just about every programming language and environment has a randomnumber

More information

Lecture 4. Continuous Random Variables and Transformations of Random Variables

Lecture 4. Continuous Random Variables and Transformations of Random Variables Math 408 - Mathematical Statistics Lecture 4. Continuous Random Variables and Transformations of Random Variables January 25, 2013 Konstantin Zuev (USC) Math 408, Lecture 4 January 25, 2013 1 / 13 Agenda

More information

GOV 2001/ 1002/ Stat E-200 Section 1 Probability Review

GOV 2001/ 1002/ Stat E-200 Section 1 Probability Review GOV 2001/ 1002/ Stat E-200 Section 1 Probability Review Solé Prillaman Harvard University January 28, 2015 1 / 54 LOGISTICS Course Website: j.mp/g2001 lecture notes, videos, announcements Canvas: problem

More information

Continuous Probability Distributions. Uniform Distribution

Continuous Probability Distributions. Uniform Distribution Continuous Probability Distributions Uniform Distribution Important Terms & Concepts Learned Probability Mass Function (PMF) Cumulative Distribution Function (CDF) Complementary Cumulative Distribution

More information

Examples of random experiment (a) Random experiment BASIC EXPERIMENT

Examples of random experiment (a) Random experiment BASIC EXPERIMENT Random experiment A random experiment is a process leading to an uncertain outcome, before the experiment is run We usually assume that the experiment can be repeated indefinitely under essentially the

More information

Solutions to Homework Set #3 (Prepared by Yu Xiang) Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solutions to Homework Set #3 (Prepared by Yu Xiang) Let the random variable Y be the time to get the n-th packet. Find the pdf of Y. Solutions to Homework Set #3 (Prepared by Yu Xiang). Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0,t]. Suppose N(t) is Poisson with pmf p N

More information