Probability Distributions - Lecture 5
- Lionel Lloyd
1 Introduction

There are a number of mathematical models of probability density functions that represent the behavior of physical systems. In this lecture we explore a few of these functions and their applications. All of the distributions, whether discrete or continuous, must be properly normalized, so that the area under the distribution function (summed over the individual components if discrete) equals 1.

2 Stirling's approximation

An extremely useful approximation of N! was developed using asymptotic series last semester. Called Stirling's approximation, it is:

    lim_{N→∞} √(2π) N^(N+1/2) e^(−N) / N! = 1

Although the difference between N! and the approximation does not converge, the ratio converges rapidly. This is shown in Table 1.

Table 1: The convergence of Stirling's approximation for N!

    N     N!          Approximation    Ratio
    1     1           0.92214          0.92214
    2     2           1.91900          0.95950
    5     120         118.019          0.98349
    10    3,628,800   3,598,696        0.99170

Stirling's approximation can be improved by adding additional terms. Thus the two-sided bounds below can be used:

    (2π)^(1/2) N^(N+1/2) e^(−N + 1/(12N+1)) < N! < (2π)^(1/2) N^(N+1/2) e^(−N + 1/(12N))
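The convergence of the ratio and the two-sided bounds can be checked numerically. This is a minimal sketch using only Python's standard `math` module; the function names are my own, not from the lecture.

```python
import math

def stirling(n):
    """Basic Stirling approximation: sqrt(2*pi) * n^(n+1/2) * e^(-n)."""
    return math.sqrt(2 * math.pi) * n ** (n + 0.5) * math.exp(-n)

# The ratio approximation/N! converges to 1 even though the difference grows.
ratios = {n: stirling(n) / math.factorial(n) for n in (1, 2, 5, 10, 50)}

def stirling_bounds(n):
    """Two-sided bounds with the 1/(12N+1) and 1/(12N) correction terms."""
    base = math.sqrt(2 * math.pi) * n ** (n + 0.5)
    return (base * math.exp(-n + 1 / (12 * n + 1)),
            base * math.exp(-n + 1 / (12 * n)))
```

The basic approximation always underestimates N!, while the corrected bounds bracket the exact value tightly.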
3 Probability distributions

3.1 Bernoulli trials

Repeated independent trials that have two outcomes, success (probability p) or failure (probability q), are called Bernoulli trials. From unitarity we have p + q = 1. A string of n independent trials having k successes might result in a sequence

    Prob = p p q p q ⋯ q p q q = p^k q^(n−k)

In most cases we are interested in the total number of successes or failures and not the order in which they occur. Suppose we want to know the number of successes out of n trials without caring about the ordering. There will be

    Possibilities = C(n, k)

arrangements, each with probability p^k q^(n−k). Thus the probability we seek is

    P_B(k, n) = C(n, k) p^k q^(n−k)

This is the binomial distribution, giving the probability of k successes out of n trials. The distribution is obviously discrete. The probabilities p, q cannot change, so either the initial population is very large or sampling is done with replacement. Examples of binomial distributions are shown in Figure 1.

The distribution can easily be shown to take the form of a binomial coefficient by considering the probability, P_x, that an event occurs x times. Write

    [p + q]^z = p^z + (z/1!) p^(z−1) q + (z(z−1)/2!) p^(z−2) q² + ⋯ + q^z

    [p + q]^z = P_z + P_(z−1) + ⋯ + P_0 = 1

An individual term, P_x, represents the probability that out of z trials, x successes and z − x failures occur. Thus

    P_x = [z! / (x!(z−x)!)] p^x q^(z−x)

The expectation value of the binomial distribution is E(k) = np with variance σ² = np(1−p). It has skewness µ₃ = np(1−p)(1−2p) = σ²(1−2p), and kurtosis

    µ₄ = np(1−p)(1−6p+6p²) = σ²(1−6p+6p²)
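The normalization and moments of the binomial distribution can be verified directly. A small sketch (helper names are mine), using the dice example below with n = 12 and p = 1/3:

```python
import math

def binom_pmf(k, n, p):
    """P_B(k, n) = C(n, k) p^k q^(n-k)."""
    q = 1.0 - p
    return math.comb(n, k) * p**k * q**(n - k)

n, p = 12, 1/3
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]

total = sum(pmf)                                          # normalization -> 1
mean = sum(k * w for k, w in enumerate(pmf))              # should equal np = 4
var = sum((k - mean)**2 * w for k, w in enumerate(pmf))   # np(1-p) = 8/3
```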
Figure 1: Examples of binomial distributions for various parameters

Example

Toss a coin 12 times with probability of success 1/2. What is the probability of having 6 successes?

    P_B(6, 12) = [12!/(6! 6!)] (1/2)^6 (1/2)^6 = 0.23

Only if the number of coin tosses is large will the average frequency of success approach 1/2. However, p and q do not have to be 1/2; they may take any values 0 ≤ p, q ≤ 1 with p + q = 1, and p, q must remain constant.

The Bernoulli model can be applied to any stochastic process where one is interested in either success or failure. Of course the model is only as good as its representation of empirical data. One might believe that after a long string of coin tosses resulting in heads, the next toss is more likely to be tails; but if this were true, the process would contain a memory of its past and so would not be random. P(S_N) is a binomial distribution. In the case of p = 1/2, this does not mean that success will occur half the time. That is, there is no tendency for the length of any string of success (or
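The coin-toss example above is a one-liner to check:

```python
import math

# Probability of exactly 6 heads in 12 fair tosses: C(12,6) / 2^12.
p_6_of_12 = math.comb(12, 6) * 0.5**12
```

The exact value is 924/4096 ≈ 0.2256, which rounds to the 0.23 quoted in the text.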
failure) to equalize, only that the frequency of the lead is approximately even; see Figure 2.

Figure 2: A possible example of a sequence of coin tosses with p = 1/2, showing the running failure/success excess over many throws, with the average tending to 0

Example

Suppose one throws 12 dice and counts 1 or 2 as a success. Then p = 1/6 + 1/6 = 1/3. Table 2 gives the result of successes for 26,306 throws of the dice. The result looks reasonable, but is statistically bad: a fit would suggest this result occurring once in 10,000 tries. A slightly biased value of p fits the data. Note that the mean is given by µ = pn = (1/3)(12) = 4 and the variance is σ² = pn(1 − p) = 8/3 = 2.67.

As another example, we look at the number of photomultiplier tubes, k, which have a signal within a given time, if the probability of any one tube having a signal is p. The total number of tubes is N. This is represented by a binomial distribution given by

    P(k, N) = C(N, k) p^k q^(N−k)

For a practical example, choose N = 342 and p = 0.1. The distribution for various values of k is shown in Figure 3. Note that the figure shows the distribution as continuous, but it is defined for discrete values of k. Calculation of the distribution is not as easy as one might suppose, due to the large and small numbers involved. A calculator will overflow or underflow when calculating N! and/or p^k, so the calculation must be carefully handled. One
Table 2: Probability of throwing 12 dice 26,306 times (columns: event number k, predicted P(k; 12, 1/3), observed counts, and P(k; 12, p) for the fitted bias)

way to proceed for very large factorial numbers is to use Stirling's approximation and write the distribution in terms of logarithms. Thus

    N! ≈ √(2π) N^(N+1/2) e^(−N)

Substitute this into the above equation, write the remaining terms in logarithms, and collect terms. The result is

    P(k, N) = (1/√(2π)) exp[φ]

    φ = (N + 1/2) ln N − (k + 1/2) ln k − (N − k + 1/2) ln(N − k) + k ln p + (N − k) ln q

Note the average value (i.e. the peak in the distribution) is µ = pN ≈ 34 and the standard deviation is σ = √(pN(1 − p)) ≈ 5.5. The figure on the right looks like a normal distribution, and this will be discussed below.

3.2 Poisson distribution

Suppose a set of Bernoulli trials where n → ∞ and p is small, with λ = np held constant as n → ∞. Taking appropriate limits we obtain

    P_P(k, λ) = (λ^k / k!) e^(−λ)

This is the Poisson distribution. It is the limiting form of the binomial distribution for rare events. Sup-
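In practice the log-space evaluation above is done with a log-gamma function, which plays the role of Stirling's approximation for the large factorials (Γ(n+1) = n!). A sketch for the photomultiplier example (the function name is mine):

```python
import math

def log_binom_pmf(k, n, p):
    """ln P(k, n) for the binomial PMF, evaluated in log space.

    math.lgamma(x) = ln Gamma(x), so lgamma(n + 1) = ln n!; this avoids
    the overflow of n! and the underflow of p^k for large n.
    """
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

# Photomultiplier example: N = 342 tubes, p = 0.1; peak near k = pN ~ 34.
pmf_34 = math.exp(log_binom_pmf(34, 342, 0.1))
total = sum(math.exp(log_binom_pmf(k, 342, 0.1)) for k in range(343))
```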
Figure 3: The figure on the left is the binomial distribution of k successes out of 342 samples plotted on a log scale. The figure on the right is the same but plotted on a linear scale.
pose we consider

    P(0; n, p) = C(n, 0) p⁰ q^n = (1 − p)^n

    ln[P(0; n, p)] = n ln(1 − p) = n ln(1 − λ/n)

Then apply a Taylor expansion:

    ln[P] = −n(λ/n + λ²/2n² + ⋯)

Thus as n → ∞, ln[P] → −λ, so

    P(0; n, p) → e^(−λ)

Consider the ratio

    P(k; n, p) / P(k−1; n, p) = (n − k + 1) p / [k(1 − p)]

Now p = λ/n, so as n → ∞, p → 0, and

    P(k; n, p) / P(k−1; n, p) → λ/k

Begin by choosing k = 1:

    P(1; n, p) → λ P(0; n, p) = λ e^(−λ)

For k = 2:

    P(2; n, p) → (λ/2) P(1; n, p) = (λ²/2) e^(−λ)

By induction:

    P(k; n, p) = (λ^k / k!) e^(−λ)

In another development, suppose we apply Bernoulli trials with probability of success p_n, but the number of trials is the integer nearest t/(1/n) = nt. This represents sub-division of an interval of length t into segments of length 1/n. Then take the limit as n → ∞, which replaces λ by λt. Thus we find that

    P(k; λt) = e^(−λt) (λt)^k / k!
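The limit derived above can be observed numerically: with p = λ/n, the binomial PMF approaches the Poisson PMF as n grows. A minimal sketch (function names are mine):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k / math.factorial(k) * math.exp(-lam)

lam, k = 3.0, 2
# The discrepancy shrinks roughly like 1/n as n -> infinity with np = lam.
errors = [abs(binom_pmf(k, n, lam / n) - poisson_pmf(k, lam))
          for n in (10, 100, 1000, 10000)]
```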
is the probability of finding exactly k samples in a fixed interval of length t. The probability of no sample is P(0; λt) = e^(−λt). The parameter λ determines the density of samples along the t axis. However, the Poisson distribution could also be derived directly in continuous time, without using an approximation to the binomial distribution. This will be shown later. Examples of the Poisson distribution are shown in Figure 4.

Figure 4: Examples of the Poisson distribution for various parameters

To show the normalization:

    Σ_{k=0}^∞ P_k = e^(−λ) Σ_{k=0}^∞ λ^k/k!

Since

    Σ_{k=0}^∞ λ^k/k! = e^λ

we have Σ_k P_k = e^(−λ) e^λ = 1.

The Poisson distribution has one parameter, λ, obtained from the binomial distribution when p → 0 with pn = λ. Again Stirling's approximation will be useful. By considering the ratio of terms,

    P(k; n, p) / P(k−1; n, p) = 1 + [(n + 1)p − k] / (kq)
Table 3: Probability of events for an average rate of 3 events/run (columns: event number k, probability for k, probability for < k, probability for > k)

Table 4: The Poisson probability as a function of a cut on detected photo-electrons in 16 µs for λt = 0.0528 (columns: event number k, probability for k, probability for < k, probability for > k)

The greatest value of the ratio occurs when k = (n + 1)p. This is the most probable number of successes. We expect that if S_n is the number of successes in n trials, the average value S_n/n should be near p. Thus as n → ∞ the average number of successes per trial deviates from p only by a small amount. This is the law of large numbers.

Therefore, for the Poisson distribution defined by

    P(λ) = (λ^k / k!) e^(−λ)

the expectation value is λ, the variance is λ, the skewness is 1/√λ, and the kurtosis is 1/λ.

As an example, we propose to observe a nuclear state. We have data in 3 preliminary runs yielding 1, 5, and 3 events. We need to estimate how many runs are required to observe 15 events. The average number of events per run is λ = 3. Then fill in Table 3.

As another example, presume the average dark-current rate in a PMT is 3300 Hz, so that in 16 µs, λt = (3300)(16 × 10⁻⁶) = 0.0528. Apply Poisson statistics to determine the probability of getting a background count in 16 µs. The result is shown in Table 4.
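The entries of Tables 3 and 4 are straightforward Poisson sums. A sketch of the two examples (variable names are mine):

```python
import math

def poisson_pmf(k, lam):
    return lam**k / math.factorial(k) * math.exp(-lam)

# Nuclear-state example: average rate lambda = 3 events/run.
lam = 3.0
p_at_least_one = 1.0 - poisson_pmf(0, lam)   # 1 - e^{-3}

# PMT dark-current example: 3300 Hz rate in a 16 microsecond gate.
lam_t = 3300 * 16e-6                         # = 0.0528
p_background = 1.0 - math.exp(-lam_t)        # P(k >= 1) in the gate
```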
Table 5: A comparison between the binomial and Poisson distributions for the example in the text (columns: event number k, N_k, N_k/100, P(k; 100, 1/100), P(k; 1))

Table 5 compares the binomial and Poisson probability distributions for N_k, the number of 2-number sets in 100 pairs of random numbers in which the pair (7,7) appears k times.

Radioactive decay

Suppose λ is the average rate of decay of a radioactive nuclide. The average number of decays in a time t is λt. If the interval dt is small, then λ dt is the probability of a decay in dt, and the probability of no decay in dt is (1 − λ dt). For x particles to decay by time t + dt:

    P_x(t + dt) = P_1(dt) P_(x−1)(t) + [1 − P_1(dt)] P_x(t)

The first term on the right is the probability of one decay in dt multiplied by the probability of (x − 1) decays in t. The second term is the probability of no decays in dt and x decays in t. Thus

    P_x(t + dt) = λ dt P_(x−1)(t) + (1 − λ dt) P_x(t)

In the limit as dt → 0:

    dP_x/dt = λ[P_(x−1)(t) − P_x(t)]

The above has solution

    P_x = [(λt)^x / x!] e^(−λt)
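The rate-equation solution can be checked by integrating dP_x/dt = λ(P_{x−1} − P_x) with a simple forward-Euler step and comparing against the closed form. This is only a sketch; the step size, cutoff x_max, and all names are my choices.

```python
import math

lam, t_end, steps, x_max = 2.0, 1.0, 20000, 12
dt = t_end / steps

P = [0.0] * (x_max + 1)
P[0] = 1.0                       # at t = 0 no decays have occurred
for _ in range(steps):
    new = [0.0] * (x_max + 1)
    for x in range(x_max + 1):
        prev = P[x - 1] if x > 0 else 0.0
        new[x] = P[x] + lam * dt * (prev - P[x])   # Euler step of the rate equation
    P = new

# Closed-form solution: Poisson with mean lambda * t.
exact = [(lam * t_end) ** x * math.exp(-lam * t_end) / math.factorial(x)
         for x in range(x_max + 1)]
max_err = max(abs(a - b) for a, b in zip(P, exact))
```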
3.3 Normal distribution

The normal distribution is an analytical approximation to the binomial distribution when n is large and the mean is λ = np. From the binomial distribution let n → ∞ keeping p fixed and δ/n → 0, where

    δ = x − np    (deviation from the mean)
    (n − x) = nq − δ

Use Stirling's approximation, N! ≈ (2π)^(1/2) N^(N+1/2) e^(−N), substitute into the binomial distribution, and re-arrange terms. The result gives

    dP_N(λ)/dx = [1/(σ√(2π))] e^(−(x−λ)²/(2σ²))

An example of the normal distribution is shown in Figure 5. Here λ is the mean value (expectation value), σ² is the variance, and both the skewness and kurtosis equal zero. Integration over −∞ < x < ∞ gives P_N = 1, and for the interval x = λ ± σ,

    [1/(σ√(2π))] ∫_{λ−σ}^{λ+σ} dx e^(−(x−λ)²/(2σ²)) = erf(1/√2) = 0.683

In the above, erf(x) is the error function evaluated at x. An example of the error function is shown in Figure 6.

The mean or expectation value is E(x) = µ with variance σ². The skewness and kurtosis vanish. The normal distribution is the most important distribution for statistical analysis and is usually written as N(µ, σ²). Note that σ is not the half-width at half-height, which is 1.18σ. The distribution extends equally above and below the mean, so if the mean is not large, the distribution assigns probability to negative values of the variable. This is of course non-physical, and means that the normal approximation to the binomial or Poisson distributions is invalid there.

The probability content of various intervals of the variable z = (x − µ)/σ is:

    P(−1.64 < z < 1.64) = 0.90
    P(−1.96 < z < 1.96) = 0.95
    P(−2.58 < z < 2.58) = 0.99
Figure 5: An example of a normal distribution showing the defining parameters

Figure 6: Graphs of the error integral for various parameters

    P(−3.29 < z < 3.29) = 0.999
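The interval probabilities quoted above follow from the error function, which is available in Python's standard library. A sketch (the helper name is mine):

```python
import math

def central_interval(z):
    """P(-z < Z < z) for a standard normal variable Z, via the error function."""
    return math.erf(z / math.sqrt(2))

p_1sigma = central_interval(1.0)    # ~ 0.683
p_196 = central_interval(1.96)      # ~ 0.95
p_329 = central_interval(3.29)      # ~ 0.999
```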
As an example, suppose a sample of 100 events with p = 0.3. This is a relatively small sample, but it is used here to note deviations from the binomial distribution. This is illustrated in Table 6, where S_n represents the number of successes as one moves away from the expected value of 30.

4 Error integral

    erf(x) = (2/√π) ∫_0^x dt e^(−t²)

    erfc(x) = 1 − erf(x)

The error function gives the probability that a number y drawn from a normal distribution is less than x:

    Prob(y < x | Gaussian) = (1/2)[1 + erf((x − µ)/(√2 σ))]

5 Relationship between the distributions

    Distribution    Probability of success    # events    Average
    Binomial        p                         n           np = λ
    Poisson         p → 0                     n → ∞       np → λ
    Gaussian        —                         n → ∞       λ → ∞

Figure 7: The connection between the binomial, Poisson, and normal distributions
Table 6: A comparison between the binomial and normal distributions for N = 100 and p = 0.3 (columns: success range S_n, binomial probability, normal probability, percent error)

The relationship of various distributions to the binomial distribution is shown in Figure 7. The error made in using the normal distribution is small if npq is large. If n is large and p is small, the Poisson distribution is more appropriate. If neither limit applies, neither the Poisson nor the normal approximation is appropriate.

The mean value is the arithmetic average of a data set. It is defined as the expectation value:

    m = E(X) = (1/N) Σ_{i=1}^N X_i

The breadth of the statistical fluctuation about the mean is measured by the standard deviation:

    σ² = (1/N) Σ_{i=1}^N (X_i − m)²

Then for the normal approximation to the binomial, σ² = npq; for the Poisson distribution, σ² = λ = m. Of course neither m nor λ can be determined exactly from a finite sample. The distribution of the mean values is more normal than the parent distribution.

As another example, consider a comparison of the Poisson distribution to the normal distribution. Suppose we choose N = 10⁸ and p = 10⁻⁶. Then npq ≈ 100. The Poisson distribution agrees with the binomial distribution P_B(k; 10⁸, 10⁻⁶). This is now compared to the normal distribution in Table 7.
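Entries like those of Table 7 can be reproduced by summing the Poisson PMF over a range and comparing with the normal integral. This sketch uses λ = 100 and the range 95 ≤ k ≤ 105; the half-unit continuity correction is my addition, not from the lecture.

```python
import math

def poisson_pmf(k, lam):
    # Work in log space: k! overflows a float for k ~ 100.
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

def normal_interval(a, b, mu, sigma):
    """P(a <= k <= b) under N(mu, sigma^2), with a half-unit continuity correction."""
    def cdf(x):
        return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))
    return cdf(b + 0.5) - cdf(a - 0.5)

mu = 100.0
sigma = math.sqrt(mu)                                       # sigma^2 = lambda
p_pois = sum(poisson_pmf(k, mu) for k in range(95, 106))    # P(95 <= k <= 105)
p_norm = normal_interval(95, 105, mu, sigma)
```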
Table 7: A comparison between the Poisson distribution P_P(k; 100) and the normal distribution P_N(µ, σ) for the success ranges P(85,90), P(90,95), P(95,105), P(90,110), P(110,115), P(115,120)

As a final example, suppose a telephone exchange serves 2000 phones and wishes to connect to another exchange. Obviously one does not run 2000 cables. Choose the number of trunk lines, N, such that with some probability every call will connect. Suppose that only 1% of calls may be dropped and each call lasts 2 minutes, so the probability that a given phone requires a trunk line at a given moment is p = 1/30. Assume random requirements for the lines. There are then 2000 Bernoulli trials with p = 1/30, and N is chosen such that the probability of more than N simultaneous calls is less than 0.01. For a Poisson approximation take λ = 2000/30 = 66.7 and evaluate the tail probability P(S_n > N) for N = 87. For the normal approximation, evaluate the tail beyond (N − 1/2 − np)/√(npq).

6 Statistical distributions

In statistical mechanics we divide phase space into a large number of cells, N, and select k particles to be placed in the cells, N > k. In this way the state of a system is defined by a random distribution of the k particles in the N cells. Assume all N^k arrangements have equal probabilities. This results in a Maxwell-Boltzmann distribution. The particles can be distributed so that the N cells contain L₁, L₂, …, L_N particles, where Σ L_i = k. There are k! ways to order the k particles and L_i! ways to reorder the particles within the i-th cell, so the number of distinct arrangements is

    N_P = k! / (L₁! L₂! ⋯ L_N!)

For equal probability of all arrangements, the probability that the N cells contain L₁, L₂, …, L_N indistinguishable particles is

    P_M = N_P / N^k
7 Example

Consider 6 particles among which 9 units of energy are distributed, as illustrated in Figure 8. There are 26 possible distributions.

Figure 8: Examples of the distribution of 6 particles into 10 energy states

The 3 distributions in the figure have different numbers of statistical possibilities as calculated by Maxwell-Boltzmann statistics, but the sum of the energies of all the states is the same total energy. The number of possible statistical arrangements is written above each distribution for the Maxwell-Boltzmann and Bose-Einstein distributions. Perhaps it is easiest to find the total number of possible energy states by tabulation. There are 26 such possibilities, assuming indistinguishable rearrangements within each sub-energy level. There are N^k total ways to distribute the particles, so the probability of a Maxwell-Boltzmann distribution is

    P_MB = [k! / (L₁! L₂! ⋯ L_N!)] / N^k

The average occupancy is the sum over the 26 distributions of the number of particles in a given energy state, divided by 26.

On the other hand, if the particles are indistinguishable and described by Bose-Einstein statistics, all the distinct distributions have equal probability. Thus not all N^k arrangements are equally probable. In the example above, each macro-arrangement is equally probable, having an energy proportional to N. It is assigned a probability given by (the inverse of the number
of representations):

    P_BE = C(N + k − 1, k)^(−1)

Low energy states are more probable in Bose-Einstein than in Maxwell-Boltzmann statistics. To obtain the distribution of particles as a function of energy, the average population of each energy state must be evaluated.

Using Fermi-Dirac statistics only one particle can occupy a state, so k ≤ N and P_FD = C(N, k)^(−1). In this case the statistical sample is obtained by specifying which k of the N cells contain a particle and then re-ordering the particles.

Statistical mechanics finds the maximum number densities for a given energy that are derived from these expressions. Figure 9 compares the Maxwell-Boltzmann distribution to the Bose-Einstein distribution.

    ρ_MB = e^(−(ε − µ)/kT)            (Maxwell-Boltzmann)
    ρ_BE = 1/(e^((ε − µ)/kT) − 1)     (Bose-Einstein)
    ρ_FD = 1/(e^((ε − µ)/kT) + 1)     (Fermi-Dirac)

8 Example

Consider the distribution given in the diagram: the number of cells is 5 (N = 5) and the number of particles is 3 (k = 3).

Bose-Einstein statistics:

    C(N + k − 1, k) = C(7, 3) = 35

    P_BE = 1/35 = 0.0286

Fermi-Dirac:
Figure 9: A comparison of the Maxwell-Boltzmann and Bose-Einstein distributions

    C(N, k) = C(5, 3) = 10

    P_FD = 1/10 = 0.1

The 10 possible arrangements are the 10 ways of choosing which 3 of the 5 cells are occupied.

Maxwell-Boltzmann statistics: the number of arrangements is N^k = 5³ = 125, with k! orderings of the particles and equal probability 1/N^k per arrangement:

    P_MB = k!/N^k = 6/125 = 0.048

For the state illustrated by the graph above, the indistinguishable states are the k! = 6 permutations of the 3 labeled particles among the occupied cells: (1 2 3), (2 3 1), (3 1 2),
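The three counting rules of the example reduce to binomial coefficients, as a quick check shows (variable names are mine):

```python
from math import comb, factorial

N, k = 5, 3   # 5 cells, 3 particles

p_be = 1 / comb(N + k - 1, k)   # Bose-Einstein: 1/C(7,3) = 1/35
p_fd = 1 / comb(N, k)           # Fermi-Dirac:   1/C(5,3) = 1/10
p_mb = factorial(k) / N**k      # Maxwell-Boltzmann: k!/N^k = 6/125
```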
(2 1 3), (1 3 2), (3 2 1).

9 Interval distribution

The interval distribution uses the Poisson distribution to represent the waiting time between successive events of a random process. We have already used this to represent radioactive decay. The mean rate of events is λ. The probability of no events in a time t is

    P_0 = [(λt)⁰/0!] e^(−λt) = e^(−λt)

The probability of k events in t is then

    P_k = [(λt)^k/k!] e^(−λt)

9.1 Example

Consider 2 models of a set of coin tosses. In model (1) the probability p is unknown, and in model (2) p = 1/2. The likelihoods are obtained from the binomial distribution:

    P(k; N, p) = C(N, k) p^k (1 − p)^(N−k) = C_N p^k (1 − p)^(N−k)

This is written as Prob(D | p, M), which represents the conditional probability of observing the data, D, conditioned on the model, M, and probability, p. In the case of model (2) this is

    P(k; N, 1/2) = C(N, k) (1/2)^N

To compare the models we want to invert the probabilities; that is, we want Prob(p | D, M). Thus we use Bayes' theorem:

    Prob(p | D, M) = Prob(D | p, M) Prob(p | M) / Prob(D | M)

We do not know the prior probability for model (1), Prob(p | M₁). Suppose we choose it by assuming

    P(p | M₁) = [(2n + 1)!/(n!)²] p^n (1 − p)^n

In the above, n = 0 represents no information and n = 10 represents incomplete, but informed, information. The first case gives a flat prior. The second case gives a peak for a
probability of 1/2 with a spread in the distribution. The inverted (posterior) probability is

    Prob(p | D, M₁) = C_N p^(k+n) (1 − p)^(N−k+n) / Prob(D | M₁)

Of course this may be iterated using the posterior Prob(p | D, M₁) as the next prior. This means the probability is adjusted based on the results of each coin toss. Then, to compare models,

    R = Prob(M₁ | D) / Prob(M₂ | D)

Substitution of Bayes' theorem into the above gives the result. For a large number of tries this can be evaluated analytically in approximation:

    R = [(2n + 1)! (k + n)! (N − k + n)! / ((n!)² (N + 2n + 1)!)] 2^N

Note this becomes independent of n for large N, k. Figure 10 shows convergence of the posterior no matter which initial (prior) probability is used.

10 Chi-square distribution

The chi-square distribution is important in statistics, as it is used for testing the goodness of fit of a theory to data and for comparing various theories. Let X_i be a set of N normally distributed variables with means µ_i and variances σ_i². Write the variate Z_i in the form

    Z_i = (X_i − µ_i)/σ_i

and denote χ² = Σ_{i=1}^N Z_i². Then the probability density becomes

    P(χ²/2) = (χ²/2)^(N/2−1) e^(−χ²/2) / Γ(N/2)

Here N is the number of degrees of freedom: the number of normal variates whose squares are added to produce the χ². Examples of χ² distributions are shown in Figure 11. The shape of the curve depends on N. The expectation value is N, the variance is 2N, the skewness is 2(2/N)^(1/2), and the kurtosis is 12/N. Asymptotically, χ²(N) becomes a normal distribution. In statistical applications we are usually interested in the shaded area to the right of a particular value of χ², as shown in Figure 11. This must be obtained numerically from tables, for example Figure 12. However, for large N the normalized chi-square, χ²/N, is normally distributed about 1 with variance 2/N.
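The model-comparison ratio R can be evaluated in log space with a log-gamma function, since the factorials are far too large for floats. This is a sketch under my own choice of test values (N = 1000 fair-coin tosses); the function name is mine.

```python
import math

def bayes_factor(N, k, n):
    """R = Prob(M1|D)/Prob(M2|D) for the symmetric prior with exponent n.

    Uses lgamma(m + 1) = ln m! to keep all the factorials finite.
    """
    log_r = (math.lgamma(2 * n + 2) + math.lgamma(k + n + 1)
             + math.lgamma(N - k + n + 1)
             - 2 * math.lgamma(n + 1) - math.lgamma(N + 2 * n + 2)
             + N * math.log(2))
    return math.exp(log_r)

# Data consistent with a fair coin: k = N/2. Both the flat prior (n = 0)
# and the informed prior (n = 10) give R < 1, i.e. the simpler fixed-p
# model is favored, and R depends only weakly on the prior exponent.
r_flat = bayes_factor(1000, 500, 0)
r_informed = bayes_factor(1000, 500, 10)
```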
Figure 10: A series of figures showing the convergence to a probability based on a model and data. As N → ∞ the result is independent of the initial prior.

11 Exponential distribution

The exponential distribution has a probability density function of the form

    P(X) = λ e^(−λX)

The expectation value is 1/λ, the variance is (1/λ)², the skewness is 2, and the kurtosis
Figure 11: The figure on the left shows χ² plots for various degrees of freedom, N, vs the value of the random variable X. The figure on the right shows the χ² probability for N = 6 as a function of X, indicating the area for X > 7, which represents the error in a fit.

is 6. The exponential distribution has no memory: if no event has occurred up to time y, the probability of no event in a subsequent interval is independent of y. The distribution applies, for example, to particles arriving at a counter. We obtained this distribution previously when considering the Poisson distribution. The probability of N events occurring in a time interval t is

    P(N | t) = [(λt)^N / N!] e^(−λt)

The probability of no events is therefore e^(−λt).

12 Breit-Wigner distribution

This distribution is identical to the Cauchy distribution, and is important as an approximation to a resolution function. This distribution is a pathological case: it does not have an expectation value, variance, skewness, or kurtosis unless the distribution has a cutoff, since the integrals defining these properties do not converge. The distribution takes the form

    P(X) = (1/π) [Γ / (Γ² + (X − X₀)²)]
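The memoryless property follows directly from the exponential survival function P(T > t) = e^(−λt), and is easy to verify numerically (parameter values are my own):

```python
import math

lam = 0.5

def survival(t):
    """P(T > t) for an exponential waiting time with rate lambda."""
    return math.exp(-lam * t)

# Memorylessness: P(T > y + s | T > y) = P(T > s) for any y.
y, s = 3.0, 2.0
conditional = survival(y + s) / survival(y)
unconditional = survival(s)
```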
In the above, X₀ and Γ are the location and scale parameters respectively, Γ being the half-width at half-height.
Figure 12: A table of χ² probabilities for various N. Note that for large N, χ²/N ≈ 1 corresponds to a fit within about one standard deviation.
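For an even number of degrees of freedom, the χ² tail probability needed from a table like Figure 12 has a simple closed form, P(χ² > x) = e^(−x/2) Σ_{j=0}^{N/2−1} (x/2)^j/j!, obtained by repeated integration by parts. A sketch evaluating the shaded area of Figure 11 (N = 6, X > 7); the function name is mine:

```python
import math

def chi2_tail(x, ndf):
    """P(chi^2 > x) for an even number of degrees of freedom ndf."""
    assert ndf % 2 == 0, "closed form holds for even ndf only"
    return math.exp(-x / 2) * sum((x / 2) ** j / math.factorial(j)
                                  for j in range(ndf // 2))

# Shaded area in Figure 11: 6 degrees of freedom, chi^2 > 7.
p_gt_7 = chi2_tail(7.0, 6)
```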
Data, Estimation and Inference Pedro Piniés ppinies@robots.ox.ac.uk Michaelmas 2016 1 2 p(x) ( = ) = δ 0 ( < < + δ ) δ ( ) =1. x x+dx (, ) = ( ) ( ) = ( ) ( ) 3 ( ) ( ) 0 ( ) =1 ( = ) = ( ) ( < < ) = (
More informationPhysics Sep Example A Spin System
Physics 30 7-Sep-004 4- Example A Spin System In the last lecture, we discussed the binomial distribution. Now, I would like to add a little physical content by considering a spin system. Actually this
More information1 INFO Sep 05
Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually
More informationProbability - Lecture 4
1 Introduction Probability - Lecture 4 Many methods of computation physics and the comparison of data to a mathematical representation, apply stochastic methods. These ideas were first introduced in the
More informationChing-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12
Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution
More informationDepartment of Mathematics
Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2017 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen
More informationCME 106: Review Probability theory
: Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:
More informationProbability Distributions
The ASTRO509-3 dark energy puzzle Probability Distributions I have had my results for a long time: but I do not yet know how I am to arrive at them. Johann Carl Friedrich Gauss 1777-1855 ASTR509 Jasper
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationChapter 6 Continuous Probability Distributions
Math 3 Chapter 6 Continuous Probability Distributions The observations generated by different statistical experiments have the same general type of behavior. The followings are the probability distributions
More informationDepartment of Mathematics
Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2018 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen
More informationPROBABILITY DISTRIBUTION
PROBABILITY DISTRIBUTION DEFINITION: If S is a sample space with a probability measure and x is a real valued function defined over the elements of S, then x is called a random variable. Types of Random
More informationIntroduction to Probability
Introduction to Probability Salvatore Pace September 2, 208 Introduction In a frequentist interpretation of probability, a probability measure P (A) says that if I do something N times, I should see event
More informationMath Bootcamp 2012 Miscellaneous
Math Bootcamp 202 Miscellaneous Factorial, combination and permutation The factorial of a positive integer n denoted by n!, is the product of all positive integers less than or equal to n. Define 0! =.
More informationExam 3, Math Fall 2016 October 19, 2016
Exam 3, Math 500- Fall 06 October 9, 06 This is a 50-minute exam. You may use your textbook, as well as a calculator, but your work must be completely yours. The exam is made of 5 questions in 5 pages,
More informationChapter 3 Single Random Variables and Probability Distributions (Part 1)
Chapter 3 Single Random Variables and Probability Distributions (Part 1) Contents What is a Random Variable? Probability Distribution Functions Cumulative Distribution Function Probability Density Function
More informationPhysics 403 Probability Distributions II: More Properties of PDFs and PMFs
Physics 403 Probability Distributions II: More Properties of PDFs and PMFs Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Last Time: Common Probability Distributions
More informationIntroduction to Statistical Data Analysis Lecture 3: Probability Distributions
Introduction to Statistical Data Analysis Lecture 3: Probability Distributions James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis
More informationSlides 8: Statistical Models in Simulation
Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An
More informationStochastic Models in Computer Science A Tutorial
Stochastic Models in Computer Science A Tutorial Dr. Snehanshu Saha Department of Computer Science PESIT BSC, Bengaluru WCI 2015 - August 10 to August 13 1 Introduction 2 Random Variable 3 Introduction
More informationRandom processes and probability distributions. Phys 420/580 Lecture 20
Random processes and probability distributions Phys 420/580 Lecture 20 Random processes Many physical processes are random in character: e.g., nuclear decay (Poisson distributed event count) P (k, τ) =
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationn px p x (1 p) n x. p x n(n 1)... (n x + 1) x!
Lectures 3-4 jacques@ucsd.edu 7. Classical discrete distributions D. The Poisson Distribution. If a coin with heads probability p is flipped independently n times, then the number of heads is Bin(n, p)
More informationBrandon C. Kelly (Harvard Smithsonian Center for Astrophysics)
Brandon C. Kelly (Harvard Smithsonian Center for Astrophysics) Probability quantifies randomness and uncertainty How do I estimate the normalization and logarithmic slope of a X ray continuum, assuming
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More informationComputer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.
Simulation Discrete-Event System Simulation Chapter 4 Statistical Models in Simulation Purpose & Overview The world the model-builder sees is probabilistic rather than deterministic. Some statistical model
More informationPCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities
PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets
More informationSTAT/MATH 395 A - PROBABILITY II UW Winter Quarter Moment functions. x r p X (x) (1) E[X r ] = x r f X (x) dx (2) (x E[X]) r p X (x) (3)
STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 07 Néhémy Lim Moment functions Moments of a random variable Definition.. Let X be a rrv on probability space (Ω, A, P). For a given r N, E[X r ], if it
More informationSystem Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models
System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 29, 2014 Introduction Introduction The world of the model-builder
More informationStat410 Probability and Statistics II (F16)
Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two
More informationAdvanced Herd Management Probabilities and distributions
Advanced Herd Management Probabilities and distributions Anders Ringgaard Kristensen Slide 1 Outline Probabilities Conditional probabilities Bayes theorem Distributions Discrete Continuous Distribution
More informationTopic 3: The Expectation of a Random Variable
Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation
More informationPHYS 445 Lecture 3 - Random walks at large N 3-1
PHYS 445 Lecture 3 - andom walks at large N 3 - Lecture 3 - andom walks at large N What's Important: random walks in the continuum limit Text: eif D andom walks at large N As the number of steps N in a
More informationProbability Density Functions
Probability Density Functions Probability Density Functions Definition Let X be a continuous rv. Then a probability distribution or probability density function (pdf) of X is a function f (x) such that
More informationCHAPTER 14 THEORETICAL DISTRIBUTIONS
CHAPTER 14 THEORETICAL DISTRIBUTIONS THEORETICAL DISTRIBUTIONS LEARNING OBJECTIVES The Students will be introduced in this chapter to the techniques of developing discrete and continuous probability distributions
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More informationTom Salisbury
MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom
More informationIntroduction to Statistics. By: Ewa Paszek
Introduction to Statistics By: Ewa Paszek Introduction to Statistics By: Ewa Paszek Online: C O N N E X I O N S Rice University, Houston, Texas 2008 Ewa Paszek
More informationDiscrete Binary Distributions
Discrete Binary Distributions Carl Edward Rasmussen November th, 26 Carl Edward Rasmussen Discrete Binary Distributions November th, 26 / 5 Key concepts Bernoulli: probabilities over binary variables Binomial:
More informationProbability Distribution
Economic Risk and Decision Analysis for Oil and Gas Industry CE81.98 School of Engineering and Technology Asian Institute of Technology January Semester Presented by Dr. Thitisak Boonpramote Department
More informationECE 313 Probability with Engineering Applications Fall 2000
Exponential random variables Exponential random variables arise in studies of waiting times, service times, etc X is called an exponential random variable with parameter λ if its pdf is given by f(u) =
More informationChapter 1. Sets and probability. 1.3 Probability space
Random processes - Chapter 1. Sets and probability 1 Random processes Chapter 1. Sets and probability 1.3 Probability space 1.3 Probability space Random processes - Chapter 1. Sets and probability 2 Probability
More informationn(1 p i ) n 1 p i = 1 3 i=1 E(X i p = p i )P(p = p i ) = 1 3 p i = n 3 (p 1 + p 2 + p 3 ). p i i=1 P(X i = 1 p = p i )P(p = p i ) = p1+p2+p3
Introduction to Probability Due:August 8th, 211 Solutions of Final Exam Solve all the problems 1. (15 points) You have three coins, showing Head with probabilities p 1, p 2 and p 3. You perform two different
More informationRandom Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution
Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).
More informationfunctions Poisson distribution Normal distribution Arbitrary functions
Physics 433: Computational Physics Lecture 6 Random number distributions Generation of random numbers of various distribuition functions Normal distribution Poisson distribution Arbitrary functions Random
More informationEE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002
EE/CpE 345 Modeling and Simulation Class 5 September 30, 2002 Statistical Models in Simulation Real World phenomena of interest Sample phenomena select distribution Probabilistic, not deterministic Model
More informationBINOMIAL DISTRIBUTION
BINOMIAL DISTRIBUTION The binomial distribution is a particular type of discrete pmf. It describes random variables which satisfy the following conditions: 1 You perform n identical experiments (called
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationIntroduction to Probability Theory for Graduate Economics Fall 2008
Introduction to Probability Theory for Graduate Economics Fall 008 Yiğit Sağlam October 10, 008 CHAPTER - RANDOM VARIABLES AND EXPECTATION 1 1 Random Variables A random variable (RV) is a real-valued function
More informationChapter 2: Random Variables
ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationChap 2.1 : Random Variables
Chap 2.1 : Random Variables Let Ω be sample space of a probability model, and X a function that maps every ξ Ω, toa unique point x R, the set of real numbers. Since the outcome ξ is not certain, so is
More informationX 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition:
nna Janicka Probability Calculus 08/09 Lecture 4. Real-valued Random Variables We already know how to describe the results of a random experiment in terms of a formal mathematical construction, i.e. the
More informationBinomial random variable
Binomial random variable Toss a coin with prob p of Heads n times X: # Heads in n tosses X is a Binomial random variable with parameter n,p. X is Bin(n, p) An X that counts the number of successes in many
More informationGiven a experiment with outcomes in sample space: Ω Probability measure applied to subsets of Ω: P[A] 0 P[A B] = P[A] + P[B] P[AB] = P(AB)
1 16.584: Lecture 2 : REVIEW Given a experiment with outcomes in sample space: Ω Probability measure applied to subsets of Ω: P[A] 0 P[A B] = P[A] + P[B] if AB = P[A B] = P[A] + P[B] P[AB] P[A] = 1 P[A
More information{ p if x = 1 1 p if x = 0
Discrete random variables Probability mass function Given a discrete random variable X taking values in X = {v 1,..., v m }, its probability mass function P : X [0, 1] is defined as: P (v i ) = Pr[X =
More informationRadioactivity. PC1144 Physics IV. 1 Objectives. 2 Equipment List. 3 Theory
PC1144 Physics IV Radioactivity 1 Objectives Investigate the analogy between the decay of dice nuclei and radioactive nuclei. Determine experimental and theoretical values of the decay constant λ and the
More information1 Probability and Random Variables
1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in
More informationFourier and Stats / Astro Stats and Measurement : Stats Notes
Fourier and Stats / Astro Stats and Measurement : Stats Notes Andy Lawrence, University of Edinburgh Autumn 2013 1 Probabilities, distributions, and errors Laplace once said Probability theory is nothing
More informationStatistics, Probability Distributions & Error Propagation. James R. Graham
Statistics, Probability Distributions & Error Propagation James R. Graham Sample & Parent Populations Make measurements x x In general do not expect x = x But as you take more and more measurements a pattern
More informationDiscrete Distributions
A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose
More information