Probability Distributions


Chapter 2: Probability Distributions

In this chapter we will describe the most common probability distribution functions encountered in high energy physics.

2.1 Discrete Distributions

2.1.1 Combinatorial

Given the importance of combinatorial calculus for what follows, we summarize in this section a few of the main results. Typically we will be talking about sequences of objects; a distinction has to be made according to whether or not we care about the order of the elements in a sequence. If we care about the order we talk about permutations, otherwise we talk about combinations (permutations are ordered combinations).

Example: take the set of letters {a, b, c}. The sequences {c, a, b} and {b, a, c} are considered equivalent combinations but distinct permutations.

Permutations with repetitions: pick r objects from a set of n and put them back each time. The number of permutations (ordered sequences) is:

    n^r    (2.1.1)

Example: a byte is a sequence of 8 bits (0/1): the number of permutations with repetitions is 2^8 = 256. A lock with three digits (from 0 to 9) has 10^3 permutations.

Permutations without repetitions: pick r objects from a set of n and don't put them back. At each pick you'll have fewer objects to choose from, so the number of permutations is reduced with respect to the permutations with repetitions. The number of permutations (ordered sequences) is:

    n!/(n − r)!    (2.1.2)
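These counting rules can be sanity-checked directly with Python's standard library (math.perm and math.comb, available from Python 3.8); the numbers are the ones from the worked examples of this section:

```python
from math import comb, perm

# Permutations with repetition: n**r
assert 2**8 == 256          # 8-bit bytes
assert 10**3 == 1000        # three-digit lock, digits 0-9

# Permutations without repetition: n!/(n-r)!
assert perm(52, 4) == 52 * 51 * 50 * 49   # first 4 picks from a 52-card deck

# Combinations without repetition: the binomial coefficient
print(comb(90, 6))          # 622614630 six-number lotto draws from 90 numbers
```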

Example: take all permutations without repetitions of the 52 cards in a deck: 52! (at the first pick you choose among 52 cards, at the second among 51, etc.). Take all permutations of the first 4 picks in a deck of 52 cards: 52 · 51 · 50 · 49 = 52!/(52 − 4)! (first you pick from 52, then from 51, then from 50, then from 49).

Combinations without repetitions: pick r objects from a set of n and don't put them back. At each pick you'll have fewer objects to choose from, as for permutations, but this time all sequences that differ only by their order are considered to be the same. The number of combinations (non-ordered sequences) is the number of permutations corrected by the factor that counts the ordered sequences (i.e. r!):

    n!/((n − r)! r!) = (n choose r)    (2.1.3)

These numbers are the so-called binomial coefficients, which appear in the binomial theorem:

    (p + q)^n = ∑_{r=0}^{n} (n choose r) p^r q^(n−r)    (2.1.4)

It follows that the number of combinations extracting r objects from n, or (n − r) from n, is the same!

Example: lotto (six-number lottery game): 6 numbers are extracted (without putting them back) from a set of 90. The order of the extraction is irrelevant. The probability to win (when all tickets are sold) is 1/(90 choose 6) ≈ 1.6 · 10^(−9).

Combinations with repetitions: pick r objects from a set of n and put them back each time. As in the case of permutations with repetitions, but this time without considering the order:

    (n + r − 1)!/((n − 1)! r!) = (n + r − 1 choose r)    (2.1.5)

Example: take r scoops from n ice-cream flavours. You can take them all the same or repeat them as you like.

Factorial

For large n, the Stirling formula can be used to approximate n!:

    ln n! ≈ (n + 1/2) ln n − n + ln √(2π)    (2.1.6)

    n! ≈ √(2πn) (n/e)^n    (2.1.7)

The term (n/e)^n is called the zero-th approximation, whereas the whole expression above is the first approximation. The factorial n! can be extended to non-integer arguments x by the gamma function Γ(x):

    x! = ∫_0^∞ u^x e^(−u) du = Γ(x + 1)    (2.1.8)

    Γ(x + 1) = x Γ(x)    (2.1.9)

2.1.2 Bernoulli trials

A Bernoulli trial is an experiment with only two outcomes (success/failure, or 1/0), where success occurs with constant probability p and failure with constant probability q = 1 − p. Examples are again the coin toss, or the decay of a K+ into either µ+ν or all other channels. The random variable r ∈ {0, 1} is the outcome of the experiment, and its p.d.f. (see Fig. 2.1.1) is:

    f(r; p) = p^r q^(1−r)    (2.1.10)

The p.d.f. is simply the probability for a single experiment to give success/failure. The mean and the variance of the distribution are:

    µ = p    (2.1.11)

    V(r) = p(1 − p)    (2.1.12)

Figure 2.1.1: Bernoulli trials distribution for a fixed p = 0.3 and 1000 trials.

2.1.3 Binomial

Given n Bernoulli trials with a probability of success p, the binomial distribution gives the probability to observe r successes and, consequently, n − r failures, independently of the order

with which they appear. The random variable is again r, but this time r ∈ {0, ..., n}, i.e. the maximum is reached when all trials give a success. The p.d.f. is:

    P(r; n, p) = (n choose r) p^r (1 − p)^(n−r)    (2.1.13)

Equation 2.1.13 can be motivated in the following way: the probability that we get the result S in the first r attempts (and not in the last n − r attempts) is given by p^r (1 − p)^(n−r); but this sequential arrangement is only one of a total of (n choose r) possible arrangements. The distribution for different values of the parameters is plotted in figure 2.1.2. The important properties of the binomial distribution are:

- It is normalized to 1, i.e. ∑_{r=0}^{n} P(r) = 1.
- The mean of r is <r> = ∑_{r=0}^{n} r P(r) = np.
- The variance of r is V(r) = np(1 − p).
- The binomial distribution (like several others we will encounter) has the reproductive property: if X is binomially distributed as P(X; n, p) and Y is binomially distributed (with the same probability p) as P(Y; m, p), then the sum is binomially distributed as P(X + Y; n + m, p).

Example: what is the probability to get 3 heads out of 10 coin tosses? Solution: P(3; 10, 0.5) = (10 choose 3) (0.5)^10 = (10!/(3! 7!)) · (0.5)^10 ≈ 0.117.

Example: a detector with 4 layers has an efficiency per layer to detect a traversing particle of 88%. To reconstruct the complete track of the particle, we need at least three hits (i.e. three out of the four layers have to detect the particle). What is the probability to reconstruct the track? We need at least 3 hits (i.e. 3 or 4), so we have to sum the probability to have 3 hits and the probability to have 4 hits:

    P(r ≥ 3; n = 4, p = 0.88) = P(r = 3; n = 4, p = 0.88) + P(r = 4; n = 4, p = 0.88) ≈ 0.327 + 0.600 = 0.927

What if we have 3 or 5 layers? For 3 layers: P(r = 3; n = 3, p = 0.88) = 0.88³ ≈ 0.681. For 5 layers: P(r ≥ 3; n = 5, p = 0.88) ≈ 0.098 + 0.360 + 0.528 = 0.986.

How big should the efficiency be for 3, 4 or 5 layers to have an overall probability for track reconstruction greater than 99%? Given r ≥ 3, n = 3, 4, 5 and P ≥ 0.99, we need to find p.
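The last question is easy to answer numerically; the sketch below (plain Python, the function names are ours) sums the binomial tail P(r ≥ 3; n, p) and bisects for the efficiency that reaches 99%:

```python
from math import comb

def p_at_least(k, n, p):
    """Binomial tail probability P(r >= k) for r ~ Binomial(n, p)."""
    return sum(comb(n, r) * p**r * (1 - p)**(n - r) for r in range(k, n + 1))

print(round(p_at_least(3, 4, 0.88), 4))   # 0.9268: the 4-layer example above

def min_efficiency(k, n, target=0.99):
    """Smallest per-layer efficiency p with P(r >= k) >= target, by bisection."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if p_at_least(k, n, mid) >= target:
            hi = mid
        else:
            lo = mid
    return hi

for n in (3, 4, 5):
    print(n, round(min_efficiency(3, n), 3))   # 0.997, 0.958, 0.894
```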
This can be done by calculating P(r ≥ 3; n, p) for the various values of n and solving for p. For n = 3 we find p ≈ 0.997; n = 4 → p ≈ 0.958; n = 5 → p ≈ 0.894.

2.1.4 Multinomial Distribution

The preceding considerations can be directly generalized to the multidimensional problem. Assume we have n objects of k different types, and r_i is the number of objects of type i. The

Figure 2.1.2: The binomial distribution for a fixed p = 0.4 and different values of n.

number of distinguishable arrangements is then given by n!/(r_1! r_2! · · · r_k!). If we now choose r objects at random (putting them back every time), then the probability of getting an arrangement of r_i objects of type i is given by p_1^(r_1) p_2^(r_2) · · · p_k^(r_k). The overall probability is therefore simply the probability of our arrangement, multiplied by the number of possible distinguishable arrangements:

    P(r_1, ..., r_k; n, p_1, ..., p_k) = (n!/(r_1! r_2! · · · r_k!)) · p_1^(r_1) p_2^(r_2) · · · p_k^(r_k)    (2.1.14)

This distribution is called the multinomial distribution, and it describes the probability to have r_i events in bin i of a histogram with n entries. The corresponding properties are:

    <r_i> = n p_i   and   V(r_i) = n p_i (1 − p_i)    (2.1.15)

2.1.5 Poisson Distribution

The Poisson p.d.f. applies to situations where we detect events but do not know the total number of trials. An example is a radioactive source where we detect the decays but do not detect the non-decays.

The distribution can be obtained as a limit of the binomial: let λ be the expected number of radioactive decays in a period of time T. Now divide the period T into n time intervals ΔT = T/n, small enough that the probability to observe two decays in one interval is negligible. The probability to observe a decay in ΔT is then λ/n, while the probability to observe r decays in the period T is given by the binomial probability to observe r events in n trials, each of which has a probability λ/n:

    P(r; n, λ/n) = (n!/((n − r)! r!)) (λ/n)^r (1 − λ/n)^(n−r)    (2.1.16)

Under the assumption that n >> r:

    n!/(n − r)! = n(n − 1)(n − 2)...(n − r + 1) ≈ n^r    (2.1.17)

and

    (1 − λ/n)^(n−r) ≈ (1 − λ/n)^n → e^(−λ)   for n → ∞    (2.1.18)

and replacing this in the binomial expression in eq. 2.1.16 we obtain the Poisson p.d.f.:

    P(r; λ) = λ^r e^(−λ)/r!    (2.1.19)

The Poisson distribution can thus be seen as the limit of the binomial distribution when the number n of trials becomes very large and the probability p for a single event becomes very small, while the product pn = λ remains a (finite) constant. It gives the probability of getting r events if the expected number (mean) is λ.

Properties of the Poisson distribution:

- It is normalized to 1: ∑_{r=0}^{∞} P(r) = e^(−λ) ∑_{r=0}^{∞} λ^r/r! = e^(−λ) e^(λ) = 1.
- The mean <r> is λ: <r> = ∑_{r=0}^{∞} r e^(−λ) λ^r/r! = λ.
- The variance is V(r) = λ.
- P(r_1 + r_2; λ_1 + λ_2) = (λ_1 + λ_2)^(r_1+r_2) e^(−(λ_1+λ_2))/(r_1 + r_2)!: the p.d.f. of the sum of two Poisson-distributed random variables is also Poisson, with λ equal to the sum of the λ's of the individual Poissons.

Example

A historical example is the number of deadly horse-kick accidents in the Prussian army¹. The fatal incidents were registered over twenty years in ten different cavalry corps. There was a total of 122 fatal incidents, and therefore the expectation value per corps per year is given by λ = 122/200 = 0.61. The probability that no soldier is killed per year and corps is P(0; 0.61) = e^(−0.61)/0!
= 0.543. To get the expected number of cases with no incidents in one year and corps, we have to multiply by the number of observed cases (here 200 corps-years), which yields 0.543 · 200 ≈ 108.7. The total statistics of the Prussian cavalry are summarized in table 2.1.1, in agreement with the Poisson expectation.

The Poisson distribution is very often used in counting experiments:

¹ This example is mentioned for the first time in the book by L. von Bortkiewicz in the year 1898: "Das Gesetz der kleinen Zahlen".
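The horse-kick numbers can be reproduced in a few lines of Python (the observed counts in the list are von Bortkiewicz's data):

```python
from math import exp, factorial

def poisson(r, lam):
    """Poisson probability P(r; lambda) = lambda**r * e**(-lambda) / r!."""
    return lam**r * exp(-lam) / factorial(r)

lam = 122 / 200                     # 122 deaths in 200 corps-years
observed = [109, 65, 22, 3, 1]      # corps-years with 0, 1, 2, 3, 4 deaths
for r, obs in enumerate(observed):
    print(r, obs, round(200 * poisson(r, lam), 1))
# 0 109 108.7
# 1 65 66.3
# 2 22 20.2
# 3 3 4.1
# 4 1 0.6
```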

Table 2.1.1: The total statistics of deadly accidents of Prussian soldiers.

    Fatal incidents per corps and year    0      1      2     3    4
    Reported incidents                    109    65     22    3    1
    Poisson expectation                   108.7  66.3   20.2  4.1  0.6

- Number of particles which are registered by a detector in the time interval t, if the flux and the efficiency of the detector are independent of time and the dead time of the detector is sufficiently small.
- Number of interactions caused by an intense beam of particles which travel through a thin foil.
- Number of entries in a histogram, if the data are taken during a fixed time interval.
- Number of flat tires when travelling a certain distance, if the expectation value of flats per distance is constant.

Some counter-examples, in which the Poisson distribution cannot be used:

- The decay of a small amount of radioactive material in a certain time interval, if this interval is significant compared to the lifetime.
- The number of interactions of a beam of only a few particles which pass through a thick foil.

In both cases the event rate is not constant (in the first it decreases with time, in the second with distance) and therefore the Poisson distribution cannot be applied.

The Poisson p.d.f. requires that the events be independent. Consider the case of a counter with a dead time of 1 µs: if a second particle passes through the counter within 1 µs after one which was recorded, the counter is incapable of recording the second particle. Thus the detection of a particle is not independent of the detection of other particles. If the particle flux is low, the chance of a second particle arriving within the dead time is so small that it can be neglected; however, if the flux is high it cannot be. No matter how high the flux, the counter cannot count more than 10^6 particles per second. At high fluxes, the number of particles detected in some time interval will not be Poisson distributed.

Figure 2.1.3 shows the Poisson distribution for some values of λ.

2.2 Continuous Distributions
2.2.1 Uniform Distribution

The probability density function of the uniform distribution in the interval [a, b] is given by (see Fig. 2.2.4):

Figure 2.1.3: The Poisson distribution for different values of λ.

    f(x) = 1/(b − a)  if a ≤ x ≤ b;  0 otherwise    (2.2.20)

The expectation value and the variance are given by:

    <x> = ∫_a^b x/(b − a) dx = (a + b)/2    (2.2.21)

    Var(x) = (b − a)²/12    (2.2.22)

Example

Consider a detector built as a single strip of silicon with a width of 1 mm. If a charged particle hits it, the detector reads one, otherwise zero (binary readout). What is the spatial resolution of the detector? Estimating the resolution as the standard deviation of the corresponding uniform distribution, we get 1 mm/√12 ≈ 289 µm.
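The 1/√12 factor is worth checking both ways: analytically and with a toy Monte Carlo in which the reconstructed position is always the strip centre (plain Python; the 1 mm pitch is the example's value):

```python
import random
from math import sqrt

pitch = 1.0                              # strip width in mm
print(round(pitch / sqrt(12) * 1000))    # theoretical resolution in µm: 289

# Toy Monte Carlo: true hit positions uniform across the strip,
# reconstructed position fixed at the strip centre
random.seed(1)
residuals = [random.uniform(0, pitch) - pitch / 2 for _ in range(100_000)]
rms = sqrt(sum(r * r for r in residuals) / len(residuals))
print(round(rms * 1000))                 # close to 289 µm
```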

Figure 2.2.4: The uniform distribution.

2.2.2 Gaussian or Normal Distribution

The Gaussian or normal distribution is probably the most important and useful distribution we know². It is widely used in practice³. The probability density function is:

    f(x; µ, σ) = (1/(√(2π) σ)) e^(−(x − µ)²/(2σ²))    (2.2.23)

The Gaussian distribution is described by two parameters: the mean value µ and the variance σ², where σ denotes the standard deviation. By substituting z = (x − µ)/σ we obtain the so-called normal or standardized Gaussian distribution:

    N(0, 1) = (1/√(2π)) e^(−z²/2)    (2.2.24)

It has an expectation value of zero and a standard deviation of one. Properties of the normal distribution are:

- It is normalized to 1: ∫_{−∞}^{+∞} P(x; µ, σ) dx = 1.
- µ is the expectation value of the distribution, ∫_{−∞}^{+∞} x P(x; µ, σ) dx = µ, as well as (the distribution being symmetric) its mode and median.
- σ is the standard deviation and σ² the variance: ∫_{−∞}^{+∞} (x − µ)² P(x; µ, σ) dx = σ².
- If X and Y are two independent random variables distributed as f(x; µ_x, σ_x) and f(y; µ_y, σ_y), then Z = X + Y is distributed as f(z; µ_z, σ_z) with µ_z = µ_x + µ_y and σ_z² = σ_x² + σ_y².

² C.F. Gauss did not discover it all alone: independently, Laplace and de Moivre knew about this distribution.
³ A legend says that Gauss described the size of bread loaves in the city of Königsberg with the normal distribution.
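Most practical questions about the Gaussian ("what fraction lies within ±kσ?") reduce to evaluations of the error function, available in Python's standard library; the fractions computed here reappear in the tables of this section:

```python
from math import erf, sqrt

def coverage(k):
    """Fraction of a Gaussian lying within +-k standard deviations of the mean."""
    return erf(k / sqrt(2))

for k in (1, 2, 3):
    print(k, round(coverage(k), 4))   # 0.6827, 0.9545, 0.9973
print(round(coverage(1.645), 3))      # ~0.90
print(round(coverage(2.576), 3))      # ~0.99
```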

Some useful integrals, which are often needed when working with the Gaussian function:

    ∫_{−∞}^{+∞} e^(−ax²) dx = √(π/a)

    ∫_0^{+∞} x e^(−ax²) dx = 1/(2a)

    ∫_{−∞}^{+∞} x² e^(−ax²) dx = (1/(2a)) √(π/a)

    ∫_0^{+∞} x^(2n+1) e^(−ax²) dx = n!/(2a^(n+1))

    ∫_{−∞}^{+∞} x^(2n+1) e^(−ax²) dx = 0 for all n (odd integrand)

Figure 2.2.5: The standardized Gaussian distribution. The top graph shows the cumulative distribution function (CDF), the lower graph the probability density function (PDF).

Here are some numbers for the integrated Gaussian distribution:

- 68.27% of the area lies within ±1σ around the mean µ
- 95.45% lies within ±2σ
- 99.73% lies within ±3σ

- 90% of the area lies within ±1.645σ
- 95% lies within ±1.960σ
- 99% lies within ±2.576σ
- 99.9% lies within ±3.290σ

The integrated Gaussian function Φ(x) can also be expressed by the so-called error function erf(x):

    Φ(x) = (1/(√(2π) σ)) ∫_{−∞}^{x} e^(−(t − µ)²/(2σ²)) dt    (2.2.25)

    erf(x) = (2/√π) ∫_0^x e^(−t²) dt    (2.2.26)

    ⇒ Φ(x) = (1/2) [1 + erf((x − µ)/(√2 σ))]    (2.2.27)

The Full Width at Half Maximum (FWHM) is very useful to get a quick estimate of the width of a distribution, and for the specific case of the Gaussian we have:

    FWHM = 2σ √(2 ln 2) ≈ 2.355 σ    (2.2.28)

The Gaussian distribution is the limiting case of several other p.d.f.s (see Fig. 2.2.6). This is actually a consequence of the central limit theorem (CLT), discussed in section 2.4.

Figure 2.2.6: Limiting cases.

N-dimensional Gaussian Distribution

The N-dimensional Gaussian distribution is defined by:

    f(x; µ, V) = (1/((2π)^(N/2) |V|^(1/2))) exp(−(1/2) (x − µ)^T V^(−1) (x − µ))    (2.2.29)

Here, x and µ are column vectors with components x_1, ..., x_N and µ_1, ..., µ_N, respectively. The transposed vectors x^T and µ^T are the corresponding row vectors, and |V| is the determinant of the symmetric N × N covariance matrix V. The expectation values and the covariances are given by:

    <x_i> = µ_i,   V(x_i) = V_ii,   cov(x_i, x_j) = V_ij

In the simplified case of a two-dimensional Gaussian distribution, with correlation coefficient ρ, we can write:

    f(x_1, x_2; µ_1, µ_2, σ_1, σ_2, ρ) = (1/(2π σ_1 σ_2 √(1 − ρ²))) exp{ −(1/(2(1 − ρ²))) [ ((x_1 − µ_1)/σ_1)² − 2ρ ((x_1 − µ_1)/σ_1)((x_2 − µ_2)/σ_2) + ((x_2 − µ_2)/σ_2)² ] }

We will come back to the specific case of the Gaussian distribution in multiple dimensions in Ch. 3, when talking about the error matrix.

2.2.3 χ² Distribution

Assume that x_1, x_2, ..., x_n are independent random variables, each obeying a Gaussian distribution with mean µ_i and variance σ_i². Then the joint p.d.f. is:

    f(x; µ, σ) = ∏_{i=1}^{n} (1/(√(2π) σ_i)) exp(−(x_i − µ_i)²/(2σ_i²))    (2.2.30)

               = exp(−(1/2) ∑_{i=1}^{n} ((x_i − µ_i)/σ_i)²) ∏_{i=1}^{n} (1/(√(2π) σ_i))    (2.2.31)

The variable χ²(n), defined as:

    χ²(n) = ∑_{i=1}^{n} ((x_i − µ_i)/σ_i)²    (2.2.32)

being a function of random variables, is itself a random variable, distributed as a χ² distribution with n degrees of freedom. The probability density is given by (see Fig. 2.2.7):

    f(χ²; n) = ((χ²)^(n/2 − 1) e^(−χ²/2)) / (2^(n/2) Γ(n/2))    (2.2.33)

The χ²(n) p.d.f. has the properties:

- mean = n
- variance = 2n
- mode = n − 2 for n ≥ 2, and 0 for n ≤ 2

Figure 2.2.7: The χ² distribution for different degrees of freedom.

- reproductive property: χ²(n_1) + χ²(n_2) = χ²(n_1 + n_2)

Since the expectation of χ²(n) is n, the expectation of χ²(n)/n is 1. The quantity χ²(n)/n is called the reduced χ². For n → ∞, χ²(n) → N(χ²; n, 2n), i.e. it becomes a normal distribution. When using the χ² distribution in practice, the approximation by a normal distribution is already sufficient for n ≳ 30.

To understand the notation, take the particular case of the χ² for 1 degree of freedom. Let z = (x − µ)/σ, so that the p.d.f. for z is N(z; 0, 1) and the probability that z ≤ Z ≤ z + dz is:

    f(z) dz = (1/√(2π)) e^(−z²/2) dz    (2.2.34)

Let Q = Z². (We use Q here instead of χ² to emphasize that this is the variable.) This is not a one-to-one transformation, because both +Z and −Z go into +Q. The probability that Q is between q and q + dq is the sum of the probability that Z is between z and z + dz around z = √q and the probability that Z is between z and z − dz around z = −√q. The Jacobians are:

    J_± = d(±z)/dq = ±1/(2√q)    (2.2.35)

so

    f(q) dq = (1/√(2π)) e^(−q/2) (|J_+| + |J_−|) dq = (1/√(2π)) e^(−q/2) (1/(2√q) + 1/(2√q)) dq = (1/√(2πq)) e^(−q/2) dq    (2.2.36)

Replacing q with χ², we have:

    f(χ²; 1) = (1/√(2πχ²)) e^(−χ²/2)    (2.2.37)

(careful: the same symbol is used for the random variable and the argument of the p.d.f.).
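The defining construction, a sum of squared standardized Gaussian variables, translates directly into a toy check of the mean = n, variance = 2n properties (plain Python sketch; sample sizes are arbitrary):

```python
import random

random.seed(2)
n = 5                                    # degrees of freedom
N = 200_000
samples = [sum(random.gauss(0, 1)**2 for _ in range(n)) for _ in range(N)]
mean = sum(samples) / N
var = sum((s - mean)**2 for s in samples) / N
print(round(mean, 2), round(var, 2))     # close to n = 5 and 2n = 10
```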

Figure 2.2.8: The log-normal distribution for different values of the defining parameters.

2.2.4 Log-Normal Distribution

If y obeys a normal distribution with mean µ and standard deviation σ, then x = e^y obeys a log-normal distribution. This means that ln(x) is normally distributed (see Fig. 2.2.8):

    f(x; µ, σ) = (1/(√(2π) σ x)) e^(−(ln x − µ)²/(2σ²))    (2.2.38)

The expectation value and the variance are given by:

    <x> = e^(µ + σ²/2)    (2.2.39)

    Var(x) = e^(2µ + σ²) (e^(σ²) − 1)    (2.2.40)

The log-normal distribution is typically used when the resolution of a measurement apparatus is composed of different sources, each contributing a (multiplicative) amount to the overall resolution. As the sum of many small contributions of any random distribution converges by the central limit theorem to a Gaussian distribution, so the product of many small contributions is distributed according to a log-normal distribution.

Example

Consider the signal of a photomultiplier (PMT), which converts light signals into electric signals. Each photon hitting the photo-cathode emits an electron, which gets accelerated by an electric field generated by an electrode (dynode) behind it. The electron hits the dynode and emits other secondary electrons, which get accelerated towards the next dynode. This process is repeated several times (as many as the number of dynodes in the PMT). At every stage the number of secondary electrons emitted depends on the voltage applied. If the amplification per step is a_i, then the number of electrons after the k-th step, n_k = ∏_{i=0}^{k} a_i, is approximately log-normal distributed.

Figure 2.2.9: The exponential distribution for different values of τ (right: linear, left: logarithmic scale).

2.2.5 Exponential Distribution

The exponential distribution (see Fig. 2.2.9) is defined for a continuous variable t (0 ≤ t < ∞) by:

    f(t; τ) = (1/τ) e^(−t/τ)    (2.2.41)

The probability density is characterized by one single parameter τ. The expectation value is:

    <t> = (1/τ) ∫_0^∞ t e^(−t/τ) dt = τ    (2.2.42)

The variance is Var(t) = τ². An example of the application of the exponential distribution is the description of the proper decay time t of an unstable particle. The parameter τ corresponds in this case to the mean lifetime of the particle.

2.2.6 Gamma Distribution

The gamma distribution (see Fig. 2.2.10) is given by:

    f(x; k, λ) = λ^k x^(k−1) e^(−λx) / Γ(k)    (2.2.43)

The expectation value is <x> = k/λ. The variance is σ² = k/λ². Special cases:

Figure 2.2.10: The gamma distribution for different values of the parameters.

- For λ = 1/2 and k = n/2, the gamma distribution has the form of a χ² distribution with n degrees of freedom.
- The gamma distribution describes the distribution of the time t = x up to the k-th event in a Poisson process with rate λ.

The parameter k influences the shape of the distribution, whereas λ is just a scale parameter.

2.2.7 Student's t-Distribution

The Student's t distribution is used when the standard deviation σ of the parent distribution is unknown and the one evaluated from the sample, s, is used instead. Suppose that x is a random variable distributed normally (mean µ, variance σ²), and that a measurement of x yields the sample mean x̄ and the sample variance s². Then the variable t is defined as:

    t = (x̄ − µ)/(s/√N)    (2.2.44)

You can see that we have cancelled our ignorance about the standard deviation of the parent distribution by taking the ratio; the net effect is to substitute σ with the estimated s. The quantity t follows the p.d.f. f_n(t) with n degrees of freedom (see Fig. 2.2.11):

    f_n(t) = (Γ((n + 1)/2)/(√(nπ) Γ(n/2))) (1 + t²/n)^(−(n+1)/2)    (2.2.45)

If the true mean µ is known, then n = N, with N the number of measurements; if the true mean is unknown, then n = N − 1, because one degree of freedom is used for the sample mean x̄. The distribution depends only on the sample mean x̄ and the sample variance s².
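The heavier tails of the t distribution relative to a Gaussian are easy to see in a toy experiment (plain Python; the sample size N = 5 gives n = 4 degrees of freedom):

```python
import math
import random
import statistics

random.seed(4)
mu, sigma, N = 10.0, 2.0, 5

def t_value():
    """One simulated measurement of t = (xbar - mu) / (s / sqrt(N))."""
    xs = [random.gauss(mu, sigma) for _ in range(N)]
    xbar = sum(xs) / N
    s = statistics.stdev(xs)      # sample standard deviation (N - 1 denominator)
    return (xbar - mu) / (s / math.sqrt(N))

ts = [t_value() for _ in range(100_000)]
frac = sum(abs(t) > 2 for t in ts) / len(ts)
print(round(frac, 3))   # ~0.116 for n = 4 d.o.f.; a Gaussian would give 0.046
```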

Figure 2.2.11: The Student's t distribution for different values of n.

In general, the variable:

    T = Z / √(S²/n)    (2.2.46)

is governed by the Student's t distribution with n degrees of freedom if Z and S² are two independent random variables following, respectively, a normal distribution N(0, 1) and the χ² distribution χ²(n) with n degrees of freedom.

2.2.8 F Distribution

Consider two random variables, χ₁² and χ₂², distributed as χ² with n₁ and n₂ degrees of freedom, respectively. We define a new random variable F as:

    F = (χ₁²/n₁) / (χ₂²/n₂)    (2.2.47)

The random variable F follows the distribution (see Fig. 2.2.12):

    f(F) = (n₁/n₂)^(n₁/2) · (Γ((n₁ + n₂)/2)/(Γ(n₁/2) Γ(n₂/2))) · F^((n₁ − 2)/2) · (1 + (n₁/n₂) F)^(−(n₁ + n₂)/2)    (2.2.48)

This distribution is known by many names: Fisher-Snedecor distribution, Fisher distribution, Snedecor distribution, variance-ratio distribution, and F-distribution. By convention, one usually puts the larger value on top so that F ≥ 1. The F distribution is used to test the statistical compatibility between the variances of two different samples obtained from the same underlying distribution (more in section 8.6).
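Building F from two independent χ² variables takes only a few lines; as a sanity check, the sample mean approaches n₂/(n₂ − 2), the known mean of the F distribution (plain Python sketch):

```python
import random

random.seed(5)

def chi2(n):
    """One draw from a chi-square distribution with n degrees of freedom."""
    return sum(random.gauss(0, 1)**2 for _ in range(n))

n1, n2 = 4, 8
fs = [(chi2(n1) / n1) / (chi2(n2) / n2) for _ in range(100_000)]
mean = sum(fs) / len(fs)
print(round(mean, 2))    # ~1.33 = n2 / (n2 - 2)
```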

Figure 2.2.12: The F-distribution for different values of the parameters.

2.2.9 Weibull Distribution

The Weibull distribution (see Fig. 2.2.13) was originally invented to describe the rate of failures of light bulbs as they get older:

    P(x; α, λ) = αλ (λx)^(α−1) e^(−(λx)^α)    (2.2.49)

with x ≥ 0 and α, λ > 0. The parameter λ is just a scale factor, and α describes the width of the maximum. The exponential distribution is a special case (α = 1), in which the probability of failure at time t is independent of t. The Weibull distribution is very useful to describe reliability and to predict failure rates. The expectation value of the Weibull distribution is (1/λ) Γ(1 + 1/α) and the variance is (1/λ²) [Γ(1 + 2/α) − Γ²(1 + 1/α)].

2.2.10 Cauchy (Breit-Wigner) Distribution

The Cauchy probability density function is:

    f(x) = (1/π) · 1/(1 + x²)    (2.2.50)

For large values of x it decreases only slowly. Neither the mean nor the variance is defined, because the corresponding integrals are divergent. The particular Cauchy distribution of the form:

    f(m; M, Γ) = (1/π) · (Γ/2)/((m − M)² + (Γ/2)²)    (2.2.51)

is also called the Breit-Wigner function (see Fig. 2.2.14), and is used in particle physics to describe cross sections near a resonance with mass M and width Γ⁴. The Breit-Wigner

⁴ Even if the mean does not exist, noting that the distribution is symmetric we can define it to be M; Γ is the FWHM. In actual physical problems the distribution is truncated, e.g. by energy conservation, and the resulting distribution is well-behaved.

Figure 2.2.13: The Weibull distribution for different values of the parameters.

Figure 2.2.14: The Breit-Wigner distribution for different values of the parameters.

comes as the Fourier transform of the wave function of an unstable particle:

    ψ(t) ∝ e^(−iE_0 t/ħ) e^(−Γt/2)    (2.2.52)

    χ(ω) ∝ ∫_0^∞ ψ(t) e^(iωt) dt = i/(ω − ω_0 + iΓ/2)    (2.2.53)

Figure 2.2.15: Straggling functions in silicon for 500 MeV pions, normalized to unity at the most probable value Δp/x. The width w is the full width at half maximum. [4]

which, squared, gives:

    |χ(ω)|² = 1/((ω − ω_0)² + Γ²/4)    (2.2.54)

2.2.11 Landau Distribution

The Landau distribution (see Fig. 2.2.15) is used to describe the distribution of the energy loss x (a dimensionless quantity proportional to ΔE) of a charged particle passing through a thin layer of matter (energy loss by ionisation):

    p(x) = (1/(2πi)) ∫_{c−i∞}^{c+i∞} exp(s ln s + xs) ds,   c > 0    (2.2.55)

The long tail towards large energies models the large energy-loss fluctuations in thin layers. The mean and the variance of the distribution are not defined. For numerical purposes the distribution can be rewritten as:

    p(x) = (1/π) ∫_0^∞ exp(−t ln t − xt) sin(πt) dt    (2.2.56)

2.3 Characteristic Function

The characteristic function φ(t) is defined as the expectation value of e^(itx) for a p.d.f. f(x):

    φ(t) := <e^(itx)> = ∫ e^(itx) f(x) dx    (2.3.57)

i.e. φ(t) is the Fourier integral of f(x). The characteristic function completely determines the p.d.f., since by inverting the Fourier transformation we regain f(x):

    f(x) = (1/(2π)) ∫ e^(−itx) φ(t) dt    (2.3.58)
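The practical payoff, developed below, is that convolutions of probability densities become products of characteristic functions. As a numerical illustration of the Gaussian case (the sum of two independent Gaussians is again Gaussian, with added means and variances):

```python
import random
from math import sqrt

random.seed(6)
N = 200_000
sums = [random.gauss(1.0, 0.6) + random.gauss(2.0, 0.8) for _ in range(N)]
mean = sum(sums) / N
var = sum((x - mean)**2 for x in sums) / N
# The product of the two Gaussian characteristic functions predicts
# mu = 1.0 + 2.0 and sigma = sqrt(0.6**2 + 0.8**2) = 1.0
print(round(mean, 2), round(sqrt(var), 2))   # ~3.0 and ~1.0
```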

The characteristic function, as well as its first and second derivatives, is readily calculated for the special case t = 0:

    φ(0) = 1    (2.3.59)

    dφ/dt (0) = i <x>    (2.3.60)

    d²φ/dt² (0) = −(V(x) + <x>²)    (2.3.61)

What do we need characteristic functions for? They are useful when performing calculations with probability densities, for example if the convolution of two probability densities f_1 and f_2 for two random variables x_1 and x_2 has to be calculated. A convolution of f_1 and f_2 yields a new probability density g(y), according to which the sum of the random variables, y = x_1 + x_2, is distributed:

    g(y) = ∫∫ f_1(x_1) f_2(x_2) δ(y − x_1 − x_2) dx_1 dx_2 = ∫ f_1(x_1) f_2(y − x_1) dx_1 = ∫ f_2(x_2) f_1(y − x_2) dx_2    (2.3.62)

The convolution integral can now be transformed with the help of the characteristic functions:

    φ_g(t) = φ_{f_1}(t) · φ_{f_2}(t)    (2.3.63)

In words: the characteristic function of the convolution of two variables is obtained as the product of their characteristic functions. Thus it can easily be shown that the convolution of two Gaussian distributions with µ_1, σ_1 and µ_2, σ_2 is again a Gaussian distribution, with µ = µ_1 + µ_2 and σ² = σ_1² + σ_2². Furthermore, the convolution of two Poisson distributions is again a Poisson distribution. The characteristic functions of some probability densities are given in table 2.3.1.

Table 2.3.1: Characteristic functions of some probability densities.

    Distribution             Characteristic function
    Binomial                 φ(t) = (p e^(it) + q)^n
    Poisson                  φ(t) = e^(λ(e^(it) − 1))
    Gauss                    φ(t) = e^(iµt − σ²t²/2)
    χ²                       φ(t) = (1 − 2it)^(−n/2)
    Uniform (from a to b)    φ(t) = (e^(ibt) − e^(iat))/((b − a) it)
    Breit-Wigner             φ(t) = e^(iE_0 t − Γ|t|/2)
    Gamma                    φ(t) = (1 − it/λ)^(−k)

2.4 The Central Limit Theorem

The Central Limit Theorem (CLT) is probably the most important theorem in statistics, and it is the reason why the Gaussian distribution is so important. Take n independent variables x_i, distributed according to p.d.f.s f_i having means µ_i and variances σ_i²; then the p.d.f. of the sum of the x_i, S = ∑ x_i, has mean ∑ µ_i and variance

∑ σ_i², and it approaches the normal p.d.f. N(S; ∑ µ_i, ∑ σ_i²) as n → ∞.

Figure 2.4.16: Sum of random variables from a uniform distribution in [−1, 1] after 1 to 4 iterations.

The CLT holds under pretty general conditions:

- Both the mean and the variance have to exist for each of the random variables in the sum.
- Lindeberg criterion: define y_k = x_k if |x_k − µ_k| ≤ ε σ_S, and y_k = 0 if |x_k − µ_k| > ε σ_S, where ε is an arbitrary number and σ_S² = ∑ σ_i². If the variance σ²(y_1 + y_2 + ... + y_n)/σ_S² → 1 for n → ∞, then the condition for the CLT is fulfilled. In plain English: the Lindeberg criterion ensures that the fluctuations of a single variable do not dominate the sum.

An example of convergence for the CLT is given in Fig. 2.4.16, where a uniform distribution is summed over 1 to 4 iterations.

When performing measurements, the value obtained is usually affected by a large number of (hopefully) small uncertainties. If this number of small contributions is large, the CLT tells us that their total sum is Gaussian distributed. This is often the case, and it is the reason why resolution functions are usually Gaussian. But if there are only a few contributions, or if a few of the contributions are much larger than the rest, the CLT is not applicable, and the sum is not necessarily Gaussian.

2.5 References

Pretty much every book about statistics and probability covers the material of this chapter. Here are a few examples:

- L. Lyons, "Statistics for Nuclear and Particle Physicists": Ch. 3
- W. Metzger, "Statistical Methods in Data Analysis": Ch.
- PDG, "Passage of particles through matter"


More information

Statistics and data analyses

Statistics and data analyses Statistics and data analyses Designing experiments Measuring time Instrumental quality Precision Standard deviation depends on Number of measurements Detection quality Systematics and methology σ tot =

More information

Chapter 8: An Introduction to Probability and Statistics

Chapter 8: An Introduction to Probability and Statistics Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including

More information

functions Poisson distribution Normal distribution Arbitrary functions

functions Poisson distribution Normal distribution Arbitrary functions Physics 433: Computational Physics Lecture 6 Random number distributions Generation of random numbers of various distribuition functions Normal distribution Poisson distribution Arbitrary functions Random

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Statistics, Data Analysis, and Simulation SS 2015

Statistics, Data Analysis, and Simulation SS 2015 Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler

More information

Department of Mathematics

Department of Mathematics Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2017 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen

More information

Appendix A. Math Reviews 03Jan2007. A.1 From Simple to Complex. Objectives. 1. Review tools that are needed for studying models for CLDVs.

Appendix A. Math Reviews 03Jan2007. A.1 From Simple to Complex. Objectives. 1. Review tools that are needed for studying models for CLDVs. Appendix A Math Reviews 03Jan007 Objectives. Review tools that are needed for studying models for CLDVs.. Get you used to the notation that will be used. Readings. Read this appendix before class.. Pay

More information

Week 1 Quantitative Analysis of Financial Markets Distributions A

Week 1 Quantitative Analysis of Financial Markets Distributions A Week 1 Quantitative Analysis of Financial Markets Distributions A Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October

More information

Data Analysis and Monte Carlo Methods

Data Analysis and Monte Carlo Methods Lecturer: Allen Caldwell, Max Planck Institute for Physics & TUM Recitation Instructor: Oleksander (Alex) Volynets, MPP & TUM General Information: - Lectures will be held in English, Mondays 16-18:00 -

More information

Statistics for Data Analysis. Niklaus Berger. PSI Practical Course Physics Institute, University of Heidelberg

Statistics for Data Analysis. Niklaus Berger. PSI Practical Course Physics Institute, University of Heidelberg Statistics for Data Analysis PSI Practical Course 2014 Niklaus Berger Physics Institute, University of Heidelberg Overview You are going to perform a data analysis: Compare measured distributions to theoretical

More information

Department of Mathematics

Department of Mathematics Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2018 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen

More information

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability?

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability? Probability: Why do we care? Lecture 2: Probability and Distributions Sandy Eckel seckel@jhsph.edu 22 April 2008 Probability helps us by: Allowing us to translate scientific questions into mathematical

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

Probability and Distributions

Probability and Distributions Probability and Distributions What is a statistical model? A statistical model is a set of assumptions by which the hypothetical population distribution of data is inferred. It is typically postulated

More information

Counting principles, including permutations and combinations.

Counting principles, including permutations and combinations. 1 Counting principles, including permutations and combinations. The binomial theorem: expansion of a + b n, n ε N. THE PRODUCT RULE If there are m different ways of performing an operation and for each

More information

Some Statistics. V. Lindberg. May 16, 2007

Some Statistics. V. Lindberg. May 16, 2007 Some Statistics V. Lindberg May 16, 2007 1 Go here for full details An excellent reference written by physicists with sample programs available is Data Reduction and Error Analysis for the Physical Sciences,

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

PROBABILITY DISTRIBUTIONS: DISCRETE AND CONTINUOUS

PROBABILITY DISTRIBUTIONS: DISCRETE AND CONTINUOUS PROBABILITY DISTRIBUTIONS: DISCRETE AND CONTINUOUS Univariate Probability Distributions. Let S be a sample space with a probability measure P defined over it, and let x be a real scalar-valued set function

More information

RWTH Aachen Graduiertenkolleg

RWTH Aachen Graduiertenkolleg RWTH Aachen Graduiertenkolleg 9-13 February, 2009 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan Course web page: www.pp.rhul.ac.uk/~cowan/stat_aachen.html

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

STAT 414: Introduction to Probability Theory

STAT 414: Introduction to Probability Theory STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises

More information

Introduction and Overview STAT 421, SP Course Instructor

Introduction and Overview STAT 421, SP Course Instructor Introduction and Overview STAT 421, SP 212 Prof. Prem K. Goel Mon, Wed, Fri 3:3PM 4:48PM Postle Hall 118 Course Instructor Prof. Goel, Prem E mail: goel.1@osu.edu Office: CH 24C (Cockins Hall) Phone: 614

More information

Lecture 2. Distributions and Random Variables

Lecture 2. Distributions and Random Variables Lecture 2. Distributions and Random Variables Igor Rychlik Chalmers Department of Mathematical Sciences Probability, Statistics and Risk, MVE300 Chalmers March 2013. Click on red text for extra material.

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008

Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008 Probability theory for Networks (Part 1) CS 249B: Science of Networks Week 02: Monday, 02/04/08 Daniel Bilar Wellesley College Spring 2008 1 Review We saw some basic metrics that helped us characterize

More information

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

Lecture 2: Probability and Distributions

Lecture 2: Probability and Distributions Lecture 2: Probability and Distributions Ani Manichaikul amanicha@jhsph.edu 17 April 2007 1 / 65 Probability: Why do we care? Probability helps us by: Allowing us to translate scientific questions info

More information

Basics on Probability. Jingrui He 09/11/2007

Basics on Probability. Jingrui He 09/11/2007 Basics on Probability Jingrui He 09/11/2007 Coin Flips You flip a coin Head with probability 0.5 You flip 100 coins How many heads would you expect Coin Flips cont. You flip a coin Head with probability

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

STT 315 Problem Set #3

STT 315 Problem Set #3 1. A student is asked to calculate the probability that x = 3.5 when x is chosen from a normal distribution with the following parameters: mean=3, sd=5. To calculate the answer, he uses this command: >

More information

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3)

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) 3 Probability Distributions (Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) Probability Distribution Functions Probability distribution function (pdf): Function for mapping random variables to real numbers. Discrete

More information

ECE 650 1/11. Homework Sets 1-3

ECE 650 1/11. Homework Sets 1-3 ECE 650 1/11 Note to self: replace # 12, # 15 Homework Sets 1-3 HW Set 1: Review Assignment from Basic Probability 1. Suppose that the duration in minutes of a long-distance phone call is exponentially

More information

1 Review of di erential calculus

1 Review of di erential calculus Review of di erential calculus This chapter presents the main elements of di erential calculus needed in probability theory. Often, students taking a course on probability theory have problems with concepts

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics February 26, 2018 CS 361: Probability & Statistics Random variables The discrete uniform distribution If every value of a discrete random variable has the same probability, then its distribution is called

More information

Probability Distributions for Continuous Variables. Probability Distributions for Continuous Variables

Probability Distributions for Continuous Variables. Probability Distributions for Continuous Variables Probability Distributions for Continuous Variables Probability Distributions for Continuous Variables Let X = lake depth at a randomly chosen point on lake surface If we draw the histogram so that the

More information

7 Random samples and sampling distributions

7 Random samples and sampling distributions 7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces

More information

Introduction to Probability and Statistics (Continued)

Introduction to Probability and Statistics (Continued) Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:

More information

Order Statistics and Distributions

Order Statistics and Distributions Order Statistics and Distributions 1 Some Preliminary Comments and Ideas In this section we consider a random sample X 1, X 2,..., X n common continuous distribution function F and probability density

More information

Introduction to Statistics and Error Analysis

Introduction to Statistics and Error Analysis Introduction to Statistics and Error Analysis Physics116C, 4/3/06 D. Pellett References: Data Reduction and Error Analysis for the Physical Sciences by Bevington and Robinson Particle Data Group notes

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

ECE-580-DOE : Statistical Process Control and Design of Experiments Steve Brainerd 27 Distributions:

ECE-580-DOE : Statistical Process Control and Design of Experiments Steve Brainerd 27 Distributions: Distributions ECE-580-DOE : Statistical Process Control and Design of Experiments Steve Brainerd 27 Distributions: 1/29/03 Other Distributions Steve Brainerd 1 Distributions ECE-580-DOE : Statistical Process

More information

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3)

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) 3 Probability Distributions (Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) Probability Distribution Functions Probability distribution function (pdf): Function for mapping random variables to real numbers. Discrete

More information

18.175: Lecture 13 Infinite divisibility and Lévy processes

18.175: Lecture 13 Infinite divisibility and Lévy processes 18.175 Lecture 13 18.175: Lecture 13 Infinite divisibility and Lévy processes Scott Sheffield MIT Outline Poisson random variable convergence Extend CLT idea to stable random variables Infinite divisibility

More information

Chapter 5 Joint Probability Distributions

Chapter 5 Joint Probability Distributions Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 5 Joint Probability Distributions 5 Joint Probability Distributions CHAPTER OUTLINE 5-1 Two

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

Stat 5101 Notes: Brand Name Distributions

Stat 5101 Notes: Brand Name Distributions Stat 5101 Notes: Brand Name Distributions Charles J. Geyer September 5, 2012 Contents 1 Discrete Uniform Distribution 2 2 General Discrete Uniform Distribution 2 3 Uniform Distribution 3 4 General Uniform

More information

1 Exercises for lecture 1

1 Exercises for lecture 1 1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )

More information

Statistics 100A Homework 5 Solutions

Statistics 100A Homework 5 Solutions Chapter 5 Statistics 1A Homework 5 Solutions Ryan Rosario 1. Let X be a random variable with probability density function a What is the value of c? fx { c1 x 1 < x < 1 otherwise We know that for fx to

More information

Probability and Probability Distributions. Dr. Mohammed Alahmed

Probability and Probability Distributions. Dr. Mohammed Alahmed Probability and Probability Distributions 1 Probability and Probability Distributions Usually we want to do more with data than just describing them! We might want to test certain specific inferences about

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable

Distributions of Functions of Random Variables. 5.1 Functions of One Random Variable Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

Module 8 Probability

Module 8 Probability Module 8 Probability Probability is an important part of modern mathematics and modern life, since so many things involve randomness. The ClassWiz is helpful for calculating probabilities, especially those

More information

Lecture 2: Discrete Probability Distributions

Lecture 2: Discrete Probability Distributions Lecture 2: Discrete Probability Distributions IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge February 1st, 2011 Rasmussen (CUED) Lecture

More information

CHAPTER 14 THEORETICAL DISTRIBUTIONS

CHAPTER 14 THEORETICAL DISTRIBUTIONS CHAPTER 14 THEORETICAL DISTRIBUTIONS THEORETICAL DISTRIBUTIONS LEARNING OBJECTIVES The Students will be introduced in this chapter to the techniques of developing discrete and continuous probability distributions

More information

Probability and Statistics

Probability and Statistics 10.9.013 jouko.teeriaho@ramk.fi RAMK fall 013 10.9.013 Probability and Statistics with computer applications Contents Part A: PROBABILITY 1. Basics of Probability Classical probability Statistical probability

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Chapter 8 Sequences, Series, and Probability

Chapter 8 Sequences, Series, and Probability Chapter 8 Sequences, Series, and Probability Overview 8.1 Sequences and Series 8.2 Arithmetic Sequences and Partial Sums 8.3 Geometric Sequences and Partial Sums 8.5 The Binomial Theorem 8.6 Counting Principles

More information

Question Points Score Total: 76

Question Points Score Total: 76 Math 447 Test 2 March 17, Spring 216 No books, no notes, only SOA-approved calculators. true/false or fill-in-the-blank question. You must show work, unless the question is a Name: Question Points Score

More information

YETI IPPP Durham

YETI IPPP Durham YETI 07 @ IPPP Durham Young Experimentalists and Theorists Institute Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan Course web page: www.pp.rhul.ac.uk/~cowan/stat_yeti.html

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

18.175: Lecture 17 Poisson random variables

18.175: Lecture 17 Poisson random variables 18.175: Lecture 17 Poisson random variables Scott Sheffield MIT 1 Outline More on random walks and local CLT Poisson random variable convergence Extend CLT idea to stable random variables 2 Outline More

More information

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution

ACM 116: Lecture 2. Agenda. Independence. Bayes rule. Discrete random variables Bernoulli distribution Binomial distribution 1 ACM 116: Lecture 2 Agenda Independence Bayes rule Discrete random variables Bernoulli distribution Binomial distribution Continuous Random variables The Normal distribution Expected value of a random

More information

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. A Probability Primer A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. Are you holding all the cards?? Random Events A random event, E,

More information

STAT 418: Probability and Stochastic Processes

STAT 418: Probability and Stochastic Processes STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical

More information

Probability Distributions

Probability Distributions 02/07/07 PHY310: Statistical Data Analysis 1 PHY310: Lecture 05 Probability Distributions Road Map The Gausssian Describing Distributions Expectation Value Variance Basic Distributions Generating Random

More information

G.PULLAIAH COLLEGE OF ENGINEERING & TECHNOLOGY DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING PROBABILITY THEORY & STOCHASTIC PROCESSES

G.PULLAIAH COLLEGE OF ENGINEERING & TECHNOLOGY DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING PROBABILITY THEORY & STOCHASTIC PROCESSES G.PULLAIAH COLLEGE OF ENGINEERING & TECHNOLOGY DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING PROBABILITY THEORY & STOCHASTIC PROCESSES LECTURE NOTES ON PTSP (15A04304) B.TECH ECE II YEAR I SEMESTER

More information

ECE Lecture #10 Overview

ECE Lecture #10 Overview ECE 450 - Lecture #0 Overview Introduction to Random Vectors CDF, PDF Mean Vector, Covariance Matrix Jointly Gaussian RV s: vector form of pdf Introduction to Random (or Stochastic) Processes Definitions

More information

Introduction to Probability and Statistics (Continued)

Introduction to Probability and Statistics (Continued) Introduction to Probability and Statistics (Continued) Prof. icholas Zabaras Center for Informatics and Computational Science https://cics.nd.edu/ University of otre Dame otre Dame, Indiana, USA Email:

More information

Chapter 3. Julian Chan. June 29, 2012

Chapter 3. Julian Chan. June 29, 2012 Chapter 3 Julian Chan June 29, 202 Continuous variables For a continuous random variable X there is an associated density function f(x). It satisifies many of the same properties of discrete random variables

More information

Introducing the Normal Distribution

Introducing the Normal Distribution Department of Mathematics Ma 3/13 KC Border Introduction to Probability and Statistics Winter 219 Lecture 1: Introducing the Normal Distribution Relevant textbook passages: Pitman [5]: Sections 1.2, 2.2,

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

Topic 3 Random variables, expectation, and variance, II

Topic 3 Random variables, expectation, and variance, II CSE 103: Probability and statistics Fall 2010 Topic 3 Random variables, expectation, and variance, II 3.1 Linearity of expectation If you double each value of X, then you also double its average; that

More information

Frank Porter April 3, 2017

Frank Porter April 3, 2017 Frank Porter April 3, 2017 Chapter 1 Probability 1.1 Definition of Probability The notion of probability concerns the measure ( size ) of sets in a space. The space may be called a Sample Space or an Event

More information

Radioactivity. PC1144 Physics IV. 1 Objectives. 2 Equipment List. 3 Theory

Radioactivity. PC1144 Physics IV. 1 Objectives. 2 Equipment List. 3 Theory PC1144 Physics IV Radioactivity 1 Objectives Investigate the analogy between the decay of dice nuclei and radioactive nuclei. Determine experimental and theoretical values of the decay constant λ and the

More information

Lecture 2 Binomial and Poisson Probability Distributions

Lecture 2 Binomial and Poisson Probability Distributions Binomial Probability Distribution Lecture 2 Binomial and Poisson Probability Distributions Consider a situation where there are only two possible outcomes (a Bernoulli trial) Example: flipping a coin James

More information

04. Random Variables: Concepts

04. Random Variables: Concepts University of Rhode Island DigitalCommons@URI Nonequilibrium Statistical Physics Physics Course Materials 215 4. Random Variables: Concepts Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative

More information

Numerical Methods for Data Analysis

Numerical Methods for Data Analysis Michael O. Distler distler@uni-mainz.de Bosen (Saar), August 29 - September 3, 2010 Fundamentals Probability distributions Expectation values, error propagation Parameter estimation Regression analysis

More information

EXAM. Exam #1. Math 3342 Summer II, July 21, 2000 ANSWERS

EXAM. Exam #1. Math 3342 Summer II, July 21, 2000 ANSWERS EXAM Exam # Math 3342 Summer II, 2 July 2, 2 ANSWERS i pts. Problem. Consider the following data: 7, 8, 9, 2,, 7, 2, 3. Find the first quartile, the median, and the third quartile. Make a box and whisker

More information

First and Last Name: 2. Correct The Mistake Determine whether these equations are false, and if so write the correct answer.

First and Last Name: 2. Correct The Mistake Determine whether these equations are false, and if so write the correct answer. . Correct The Mistake Determine whether these equations are false, and if so write the correct answer. ( x ( x (a ln + ln = ln(x (b e x e y = e xy (c (d d dx cos(4x = sin(4x 0 dx xe x = (a This is an incorrect

More information

Probability Distribution

Probability Distribution Economic Risk and Decision Analysis for Oil and Gas Industry CE81.98 School of Engineering and Technology Asian Institute of Technology January Semester Presented by Dr. Thitisak Boonpramote Department

More information

Statistical Data Analysis 2017/18

Statistical Data Analysis 2017/18 Statistical Data Analysis 2017/18 London Postgraduate Lectures on Particle Physics; University of London MSci course PH4515 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk

More information