II. Introduction to probability, 2
GEOS 33000/EVOL 33000, January 2006 (updated January 10, 2006)

1 Random Variables

1.1 Definition: A random variable is a function defined on a sample space. In other words, it is a mapping of events to numbers.

1.2 This can be a simple one-to-one mapping (example: throw a six-sided die, and let the random variable X be the face value) or any specified function. Example: In our previous 8-point sample space of the outcomes of three coin tosses, we can define a random variable X as the number of heads. We then have:

Event  X
HHH    3
HHT    2
HTH    2
HTT    1
TTT    0
TTH    1
THT    1
THH    2

1.3 Random variables and probabilities (illustrated with a sample space of discrete points): Let x_1, x_2, ... be all the values that the random variable X can take on. Then we denote the probability that X takes on the value x_j as P(X = x_j) = f(x_j). In the previous example of the die, assuming it is fair, we have:
x_j   f(x_j)
1     1/6
2     1/6
3     1/6
4     1/6
5     1/6
6     1/6

And in the example of the sum of heads in three coin tosses, we have:

x_j   f(x_j)
0     1/8
1     3/8
2     3/8
3     1/8

2 Density and Distribution

2.1 Density: The density function f(x) is proportional to the probability that a random variable will take on a value between x and x + δx, where δx is an infinitesimal increment. (Strictly speaking, with a continuous distribution, the probability of taking on exactly a particular value is vanishingly small.) By definition, ∫ f(x) dx = 1 if f(x) is a density function.

2.2 Distribution: The distribution function F(x) is the probability that the random variable will take on a value less than or equal to x:

F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(y) dy,   and   f(x) = dF(x)/dx

So, if we know the density we can determine the distribution function, and vice versa.

2.3 Discrete case: f(x_j) = P(X = x_j) is the probability distribution, and F(x) = P(X ≤ x) = Σ_{x_j ≤ x} f(x_j) is the distribution function.
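The discrete case can be checked by direct enumeration. The following is a minimal sketch (not part of the original notes) that rebuilds the three-coin-toss table above and accumulates the distribution function from the probability distribution:

```python
import itertools
from fractions import Fraction

# Enumerate the 8-point sample space of three coin tosses and tabulate the
# probability distribution f(x_j) = P(X = x_j), where X = number of heads.
outcomes = list(itertools.product("HT", repeat=3))
f = {}
for outcome in outcomes:
    x = outcome.count("H")  # value of the random variable X for this event
    f[x] = f.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# Distribution function F(x) = P(X <= x), accumulated from f.
F = {x: sum(fx for xj, fx in f.items() if xj <= x) for x in sorted(f)}

print(f[1], f[2], F[3])  # 3/8 3/8 1
```

Exact fractions make it easy to see that the table of f(x_j) values above is reproduced and that F(3) = 1, as any distribution function must satisfy at the largest value of X.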
Examples

Exponential with parameter r:

density: f(x) = r e^{-rx}
distribution: F(x) = 1 - e^{-rx}

[Figure: plots of the density f(x) and distribution F(x) for the exponential with parameter r.]
Derivation of these functions (in relation to waiting times): Suppose we have a Poisson process with uniform rate or probability of occurrence. Then the waiting times between successive events are exponentially distributed. To see why this should be the case, suppose the process goes on at an instantaneous rate r acting over a span of time t. Imagine that we subdivide t into n fine increments of length t/n each. There will be on average rt events over the span of time t. The probability that an event will occur in one of these fine increments is rt/n. The probability that it will not occur in one of these increments is (1 - rt/n). The probability that it will not occur in n successive increments is therefore (1 - rt/n)^n. As n → ∞, so that we are now dealing with a continuous-time process, we have

P(no events in t) = e^{-rt}.

This is the same as the probability that the waiting time is greater than t. Thus the probability that the waiting time is less than or equal to t is 1 - e^{-rt}. This last quantity is the distribution F(t), and its first derivative, f(t) = r e^{-rt}, is the density.

Uniform density and distribution on (a, b):

density: f(x) = 1/(b - a)
distribution: F(x) = ∫_a^x 1/(b - a) dy = (x - a)/(b - a)
applications: spatial and temporal patterns of events dropped with constant probability
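The limiting step in the derivation, (1 - rt/n)^n → e^{-rt}, is easy to check numerically. A small sketch (not from the notes, with arbitrary values of r and t):

```python
import math

# P(no events in t) for the discretized process is (1 - r t / n)^n;
# as n grows it should approach the continuous-time value e^{-r t}.
r, t = 0.5, 2.0
exact = math.exp(-r * t)
for n in (10, 100, 10_000):
    approx = (1 - r * t / n) ** n
    print(n, round(approx, 6), round(exact, 6))
# The waiting-time distribution F(t) = 1 - e^{-r t} then follows directly.
```

Even at n = 100 the discretized probability agrees with e^{-rt} to about three decimal places, which is why the continuous exponential is a good model of waiting times in a finely subdivided Poisson process.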
Normal (standard):

density: f(x) = (1/√(2π)) e^{-x²/2}
distribution: F(x) = (1/√(2π)) ∫_{-∞}^x e^{-y²/2} dy
Discrete functions we have already considered: binomial, multinomial, Poisson.

3 Expectation (i.e. mean)

3.1 Continuous: E(X) = ∫ x f(x) dx

3.2 Discrete: E(X) = Σ_j x_j f(x_j)

I.e. the expectation is the sum of all values of a random variable, weighted by the density or probability.

3.3 Examples

Exponential with rate r: E(X) = 1/r
Binomial: E(k) = np; E(k/n) = p
Poisson: E(k) = λ
Uniform on (a, b): E(X) = (b + a)/2

3.4 Working with expectations (illustrated with the discrete case)

Let g(x) be some function of x. Then E[g(X)] = Σ_j g(x_j) f(x_j). For example, E(X²) = Σ_j x_j² f(x_j).

Sums: Suppose there are several random variables X, Y, Z, etc. Then E(X + Y + Z) = E(X) + E(Y) + E(Z).

Let a be a constant. Then E(aX) = aE(X).

Products: In general, E(XY) ≠ E(X)E(Y). Exception: E(XY) = E(X)E(Y) if X and Y are mutually independent random variables.
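The sum and product rules can be illustrated by simulation. A sketch (not in the original notes) using a fair die, for which E(X) = 3.5, and the number of heads in three tosses, for which E(Y) = 1.5:

```python
import random

random.seed(1)
N = 100_000
xs = [random.randint(1, 6) for _ in range(N)]                          # die: E(X) = 3.5
ys = [sum(random.random() < 0.5 for _ in range(3)) for _ in range(N)]  # heads: E(Y) = 1.5

def mean(v):
    return sum(v) / len(v)

# Linearity holds whether or not X and Y are independent: ~ 3.5 + 1.5 = 5.0
sum_mean = mean([x + y for x, y in zip(xs, ys)])
# The product rule requires independence; here E(XY) = 3.5 * 1.5 = 5.25
prod_mean = mean([x * y for x, y in zip(xs, ys)])
print(round(sum_mean, 2), round(prod_mean, 2))
```

Since the die and the coin tosses are generated independently, both sample means land close to the theoretical values; with dependent variables, only the sum rule would survive.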
4 Median: Value of x at which F(x) = 1/2.

4.1 Example: Exponential: Med(X) = ln(2)/r

5 Mode: Value of x with maximal value of f(x).

5.1 Example: Exponential: Mode(X) = 0

Comment: Gaps between fossil finds are exponentially distributed. A modal gap of zero implies that the single most probable outcome is that there is an infinitesimally small offset between origin and first appearance, or between extinction and last appearance. But this is still an exceedingly improbable outcome when compared with the sum of all possible alternatives.

6 Variance

6.1 Definition: V(X) = E(X²) - [E(X)]² = E{[X - E(X)]²}

6.2 What it captures: Average squared deviation between a random variable and its mean.

6.3 Examples

Binomial: V(k) = np(1 - p); V(k/n) = p(1 - p)/n
Poisson: V(k) = λ

NB mean = variance for the Poisson. This can be useful in testing whether data agree with the model of a Poisson process. This variance may seem different from that in the binomial, but consider that np = λ and that, as the probability per trial becomes ever smaller, (1 - p) → 1; therefore np(1 - p) → np = λ.
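The mean = variance property of the Poisson can be checked by simulation. A sketch (not part of the notes) that samples Poisson variates with Knuth's product-of-uniforms method, which is one standard way to do this without external libraries:

```python
import math
import random

random.seed(2)

def poisson(lam):
    # Knuth's method: multiply uniforms until the product falls below e^{-lam};
    # the number of multiplications, minus one, is a Poisson(lam) draw.
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

lam = 4.0
draws = [poisson(lam) for _ in range(50_000)]
m = sum(draws) / len(draws)
v = sum((x - m) ** 2 for x in draws) / len(draws)
print(round(m, 2), round(v, 2))  # both should be near lambda = 4
```

A sample mean far from the sample variance in real count data is evidence against a homogeneous Poisson model, which is the test alluded to above.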
Exponential with parameter r: V(X) = 1/r², i.e. V(X) = [E(X)]²
Uniform on (a, b): V(X) = (b - a)²/12

7 Covariance

7.1 Definition: cov(X, Y) = E(XY) - E(X)E(Y) = E{[X - E(X)][Y - E(Y)]}, for two random variables X and Y.

7.2 What it captures: Whether positive deviations from the mean of one random variable are matched by deviations in the other random variable that are, on average, positive, negative, or zero.

8 Variance of a Sum

8.1 Let X_1, ..., X_n be a number of random variables with finite variances σ_i².

8.2 Let S_n be a new random variable defined as S_n = X_1 + ... + X_n.

8.3 Then V(S_n) = Σ_{k=1}^n σ_k² + 2 Σ_{j<k} cov(X_j, X_k).

8.4 Thus, if X_1, ..., X_n are mutually independent, V(S_n) = Σ_{k=1}^n σ_k².

8.5 Application: Chance fluctuations about a mean probability

Suppose there are n independent trials, each with probability of success p_k, where k = 1, ..., n. Let X_k, k = 1, ..., n, be a random variable which takes on the values 0 and 1 with probabilities (1 - p_k) and p_k. For each given k, we know that E(X_k) = p_k and V(X_k) = p_k(1 - p_k) (binomial). Now let S_n = X_1 + ... + X_n. This is the total number of successes in the n trials. E(S_n) = Σ_{k=1}^n p_k. V(S_n) = Σ_{k=1}^n σ_k², because, by assumption, the trials are independent.
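The independent case in 8.4 can be checked directly for Bernoulli trials with unequal p_k. A sketch (not in the notes, with arbitrary probabilities):

```python
import random

random.seed(3)
# n independent 0/1 trials with unequal success probabilities p_k.
# For independent trials, V(S_n) = sum of p_k (1 - p_k).
ps = [0.1, 0.3, 0.5, 0.7, 0.9]
theory = sum(p * (1 - p) for p in ps)  # 0.85

N = 100_000
sums = [sum(random.random() < p for p in ps) for _ in range(N)]
m = sum(sums) / N
v = sum((s - m) ** 2 for s in sums) / N
print(round(m, 2), round(v, 2))  # mean near 2.5, variance near 0.85
```

Note that the variance of the sum comes out below n p̄ (1 - p̄) = 5(0.5)(0.5) = 1.25, which anticipates the point made in the next section: spreading the p_k apart while keeping their mean fixed reduces the variance of S_n.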
Rewrite V(S_n) as Σ_k p_k(1 - p_k) = Σ_k p_k - Σ_k p_k² = np - Σ_k p_k². Now vary the set of p_k with the constraint that Σ_k p_k = np. Σ_k p_k² is minimized, and therefore the variance is maximized, when all the p_k are equal. Counterintuitive? To quote Feller (p. 231): "Given a certain average quality p of n machines, the output will be least uniform if all the machines are equal. (An application to modern education is obvious but hopeless.)"

9 Central Limit Theorem

9.1 Suppose there is a probability distribution with mean μ and variance σ².

9.2 The sum, S_n, of n independent draws from this distribution has expectation E(S_n) = nμ and variance V(S_n) = nσ². This we have already seen.

9.3 This sum tends to be normal as n → ∞. In particular,

Z_n = (S_n - nμ)/(σ√n)

converges on the standard normal (mean = 0, variance = 1) as n increases.
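The standardization in 9.3 can be watched happening numerically. A sketch (not part of the notes) using uniform(0, 1) draws, for which μ = 1/2 and σ² = 1/12:

```python
import math
import random

random.seed(4)
mu, sigma = 0.5, math.sqrt(1 / 12)  # mean and sd of uniform(0, 1)
n, N = 100, 20_000                  # draws per sum, number of sums

# Z_n = (S_n - n mu) / (sigma sqrt(n)) for each of N replicate sums.
zs = [(sum(random.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
      for _ in range(N)]

m = sum(zs) / N
v = sum((z - m) ** 2 for z in zs) / N
below_one = sum(z <= 1 for z in zs) / N
# Z_n should look standard normal: mean ~ 0, variance ~ 1, P(Z <= 1) ~ 0.841.
print(round(m, 3), round(v, 3), round(below_one, 3))
```

Even with n = 100 the fraction of standardized sums below 1 matches the standard normal value Φ(1) ≈ 0.8413 to about two decimal places, which is the practical content of the theorem.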
10 Applications to Novel Situations

10.1 Expected value of FreqRat

Model: see Foote and Raup (1996); much more on this later.

Extinction: Constant at rate q per lineage-million-years; durations discretized into whole time intervals.
P(duration > t) = e^{-qt} (waiting time until extinction)
P(duration between t and t + 1) = e^{-qt} - e^{-q(t+1)}

Sampling: Constant at probability R per interval.

Figure 1. Ideal frequency distribution under a model of constant extinction and sampling. [Plot: log frequency against stratigraphic range in number of intervals, showing the ideal frequencies f_1, f_2, f_3, ... for a given probability of sampling per interval.]
Results of the model:

1. If single-interval taxa are ignored, the distribution of stratigraphic ranges is exponential with parameter q (just like the distribution of true durations).
2. FreqRat (f_2²/(f_1 f_3)) equals R for the ideal distribution (in the absence of sampling variation).

Check for bias: What is the expected value of the observed FreqRat for a finite sample of data?

Consider four outcomes: range = 1, range = 2, range = 3, range > 3. Let f_1, ..., f_4 be the true probabilities of having the given range, according to the stated model and known values of q and R. Let the observed numbers of taxa falling into each of these categories be denoted k_1, ..., k_4, and let N = Σ_i k_i. Then, by the multinomial distribution, the probability of observing the given set of k_i is:

P(k_1, k_2, k_3, k_4) = [N!/(k_1! k_2! k_3! k_4!)] f_1^{k_1} f_2^{k_2} f_3^{k_3} f_4^{k_4}

The expected value of the observed FreqRat is:

E[k_2²/(k_1 k_3)] = [Σ_{k_1=1}^{N} Σ_{k_2=1}^{N-k_1} Σ_{k_3=1}^{N-k_1-k_2} (k_2²/(k_1 k_3)) P(k_1, k_2, k_3, k_4)] / [Σ_{k_1=1}^{N} Σ_{k_2=1}^{N-k_1} Σ_{k_3=1}^{N-k_1-k_2} P(k_1, k_2, k_3, k_4)]

This sum is taken only over those values of k_i for which the FreqRat can be defined; thus the normalization by the sum of probabilities in the denominator is needed. I.e., we are dealing with the conditional probability of observing a given FreqRat, given that the FreqRat is definable.

Example: q = 0.25, R = 0.25; f_1 = …, f_2 = …, f_3 = …
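The conditional expectation above can be evaluated by brute-force enumeration for small N. A sketch (not part of the notes); the f_1, ..., f_4 used here are hypothetical placeholders, since the values implied by q = 0.25 and R = 0.25 are not reproduced in this copy:

```python
import math
from itertools import product

# Hypothetical category probabilities f_1..f_4 (illustrative only).
f = [0.4, 0.25, 0.15, 0.2]
N = 20

def prob(ks):
    # Multinomial probability of the observed counts ks under probabilities f.
    coef = math.factorial(N)
    for k in ks:
        coef //= math.factorial(k)
    out = float(coef)
    for k, fi in zip(ks, f):
        out *= fi ** k
    return out

num = den = 0.0
for k1, k2, k3 in product(range(1, N + 1), repeat=3):
    k4 = N - k1 - k2 - k3
    if k4 < 0:
        continue  # counts must sum to N
    p = prob((k1, k2, k3, k4))
    num += (k2 ** 2 / (k1 * k3)) * p
    den += p      # normalizes over the cases where FreqRat is definable

ideal = f[1] ** 2 / (f[0] * f[2])   # f_2^2 / (f_1 f_3)
expected = num / den
print(round(ideal, 3), round(expected, 3))
```

For these placeholder probabilities the finite-sample conditional expectation comes out above the ideal ratio, consistent with the upward bias discussed next.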
f_2²/(f_1 f_3) = …

N    E(FreqRat)
…    …

Thus, the FreqRats reported by Foote and Raup (1996) are probably overestimates.

FreqRats reported by Foote and Raup (1996):

Group        Taxonomic level   Stratigraphic interval   N   FreqRat
Trilobites   species           60-foot intervals        …   …
Crinoids     genus             stage                    …   …
Mammals      species           0.7 m.y.                 …   …
Bivalves     species           5 m.y.                   …   …
10.2 Expectation and variance of nearest-neighbor distance

Application: searching for clustering of points in morphospace or other space (see Foote, 1990, Systematic Zoology 39).

Assume points randomly dropped in N-dimensional space.

Volume of an N-dimensional hypersphere:

V_N = π^{N/2} r^N / Γ(N/2 + 1),

where r is the radius and Γ is the interpolation of factorials. Note that Γ(m + 1) = m! and Γ(m + 1/2) = √π (2m)! / (2^{2m} m!).

Let a = π^{N/2} / Γ(N/2 + 1), so that V_N = a r^N.

Let ρ be the point density (total points / total volume).

Then the mean number of points in a hypersphere of radius r is given by λ = ρ a r^N.

The probability of no points in a hypersphere of radius r is given by P(0) = e^{-λ} (Poisson).

Note that this is the probability that a randomly chosen point will have no points within a distance r of it, i.e. that its nearest-neighbor distance will be greater than r. Therefore, the proportion of all nearest-neighbor distances less than or equal to r is given by 1 - e^{-λ}. This leads immediately to the distribution function:

Distribution: F(r) = 1 - e^{-λ} = 1 - e^{-ρ a r^N}

Take the first derivative of this with respect to r to get the density function.
Density: f(r) = d/dr (1 - e^{-ρ a r^N}) = ρ a N r^{N-1} e^{-ρ a r^N}

Expectation:

E(r) = ∫_0^∞ r ρ a N r^{N-1} e^{-ρ a r^N} dr,

which, after consulting a table of integrals, yields

E(r) = Γ((N+1)/N) / (a^{1/N} ρ^{1/N}),

where a and ρ are as defined earlier.

Variance:

E(r²) = ∫_0^∞ r² ρ a N r^{N-1} e^{-ρ a r^N} dr,

which is equal to

E(r²) = Γ((N+2)/N) / (a^{2/N} ρ^{2/N}),

from which we get

V(r) = E(r²) - [E(r)]² = {Γ((N+2)/N) - [Γ((N+1)/N)]²} / (a^{2/N} ρ^{2/N}),

where a and ρ are as defined above.

Check of equations with N = 2 (which has already been solved; see Clark and Evans, 1954, Ecology 35):

E(r) = 1/(2√ρ) (OK)
V(r) = (4 - π)/(4πρ) (OK)
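The N = 2 check can also be done by simulation. A sketch (not in the notes) that drops points on a unit torus; wraparound distances are an assumption made here to avoid the edge effects of a bounded square:

```python
import math
import random

random.seed(5)
# Drop points uniformly on the unit torus (N = 2, unit area) and compare the
# mean nearest-neighbor distance with the prediction E(r) = 1/(2 sqrt(rho)).
n_pts = 400
pts = [(random.random(), random.random()) for _ in range(n_pts)]

def torus_dist(p, q):
    # Euclidean distance with wraparound in each coordinate.
    dx = min(abs(p[0] - q[0]), 1 - abs(p[0] - q[0]))
    dy = min(abs(p[1] - q[1]), 1 - abs(p[1] - q[1]))
    return math.hypot(dx, dy)

nn = [min(torus_dist(p, q) for q in pts if q is not p) for p in pts]
mean_nn = sum(nn) / n_pts
rho = n_pts / 1.0  # point density: points per unit area
print(round(mean_nn, 4), round(1 / (2 * math.sqrt(rho)), 4))  # both near 0.025
```

Clustered points would give a mean nearest-neighbor distance well below this Poisson expectation, which is what makes the statistic useful for detecting clustering in morphospace.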
10.3 Expected correlation between origination rate, extinction rate, and net rate of diversification (from Foote (2006), Paleobiology [in press])

Motivating question: How is variation in rate of diversification partitioned into origination and extinction components?

[Foote Figure 1 (two columns): A, changes in diversification, origination, and extinction rates; B, extinction-diversification correlation against origination-diversification correlation.]

[Foote Figure 3 (one column): extinction-diversification correlation against origination-diversification correlation for carbonate, clastic, and all genera.]
Derivation of expected correlations, starting with variance in origination rate, variance in extinction rate, and correlation between origination and extinction.

Let p_i, q_i, and d_i be the per-capita rates of origination, extinction, and net diversification in time interval i, where d_i = p_i - q_i. Let Δp_i = p_i - p_{i-1}, Δq_i = q_i - q_{i-1}, and Δd_i = d_i - d_{i-1} be the first differences in these rates. Note that Δd_i = Δp_i - Δq_i.

Let s_p², s_q², and s_d² be the variances in p, q, and d taken over the series of time intervals, and let s_pq, s_pd, and s_qd be the covariances between the subscripted variables.

The variance of a random variable X is given by

s_x² = E(X²) - μ_x²,   (1a)

where E(X²) denotes the expectation of X² and μ_x is the mean of X (Feller 1968: p. 227). Rearranging yields

E(X²) = s_x² + μ_x².   (1b)

Similarly, the covariance between two random variables X and Y is given (Feller 1968: p. 230) by

s_xy = E(XY) - μ_x μ_y,   (2a)

which implies that

E(XY) = s_xy + μ_x μ_y.   (2b)

Because d = p - q, we also need the general expression for the variance of a difference between two random variables (Feller 1968: p. 230):

s²_(x-y) = s_x² + s_y² - 2 s_xy,   (3)

and for the covariance between this difference and either of the random variables:

s_x(x-y) = E(X²) - E(XY) - μ_x² + μ_x μ_y,   (4a)

and

s_y(x-y) = -E(Y²) + E(XY) + μ_y² - μ_x μ_y.   (4b)

Note that the product-moment correlation is given (Sokal and Rohlf 1995: p. 560) by

r_xy = s_xy / √(s_x² s_y²).   (5)

Suppose we are given s_p², s_q², and r_pq, and are to determine r_pd and r_qd from these quantities. Starting with r_pd and substituting p and d in equation (5), p and q in equation
(4a), p and q in equation (3), p² in equation (1b), and p and q in equation (2b), we have:

r_pd = s_pd / √(s_p² s_d²)
     = [E(p²) - E(pq) - μ_p² + μ_p μ_q] / √(s_p²(s_p² + s_q² - 2 s_pq))
     = [(s_p² + μ_p²) - (s_pq + μ_p μ_q) - μ_p² + μ_p μ_q] / √(s_p²(s_p² + s_q² - 2 r_pq s_p s_q))
     = (s_p² - r_pq s_p s_q) / √(s_p²(s_p² + s_q² - 2 r_pq s_p s_q))   (6)

Similarly, starting with r_qd and substituting q and d in equation (5), p and q in equation (4b), p and q in equation (3), q² in equation (1b), and p and q in equation (2b), we have:

r_qd = (r_pq s_p s_q - s_q²) / √(s_q²(s_p² + s_q² - 2 r_pq s_p s_q))   (7)

Let s_q be some multiple of s_p, so that s_q = k s_p, s_p s_q = k s_p², and s_q² = k² s_p². Substituting into equations (6) and (7), we have:

r_pd = s_p²(1 - r_pq k) / √(s_p²(s_p² + k² s_p² - 2 r_pq k s_p²))
     = (1 - r_pq k) / √(k² - 2 r_pq k + 1)   (8)

and

r_qd = (r_pq k s_p² - k² s_p²) / √(k² s_p²(s_p² + k² s_p² - 2 r_pq k s_p²))
     = (r_pq - k) / √(k² - 2 r_pq k + 1)   (9)
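Equations (8) and (9) can be verified numerically. A sketch (not part of the paper) that generates rate series with a chosen variance ratio k = s_q/s_p and correlation r_pq, then compares sample correlations of d = p - q against the formulas:

```python
import math
import random

random.seed(6)
# Build series p and q with sd ratio k = s_q / s_p and correlation r_pq.
r_pq, k = 0.3, 1.4
n = 50_000
p = [random.gauss(0, 1) for _ in range(n)]
noise = [random.gauss(0, 1) for _ in range(n)]
q = [k * (r_pq * a + math.sqrt(1 - r_pq ** 2) * b) for a, b in zip(p, noise)]
d = [a - b for a, b in zip(p, q)]  # net diversification d = p - q

def corr(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    return cov / math.sqrt(sum((a - mu) ** 2 for a in u) *
                           sum((b - mv) ** 2 for b in v))

denom = math.sqrt(k ** 2 - 2 * r_pq * k + 1)
print(round(corr(p, d), 3), round((1 - r_pq * k) / denom, 3))  # eq. (8)
print(round(corr(q, d), 3), round((r_pq - k) / denom, 3))      # eq. (9)
```

With k = 1.4 and r_pq = 0.3 the sample and predicted correlations agree to a few thousandths, and r_qd comes out strongly negative while r_pd stays positive, matching the k > 1 case discussed below.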
Figures 13-15 show how r_pd, r_qd, and r_pd² + r_qd² vary as a function of r_pq. The figures depict three situations: k > 1, k = 1, and k < 1, i.e., s_p < s_q, s_p = s_q, and s_p > s_q.

When s_p < s_q, r_pd (solid line in Fig. 13A) decreases monotonically as r_pq increases, while r_qd (dashed line in Fig. 13A) reaches a minimum value at r_pq = 1/k, i.e., at r_pq = s_p/s_q, and this minimum is equal to -√(k² - 1)/k. The minimum of r_qd coincides with r_pd = 0. Therefore the minimum of r_qd also corresponds to a minimum of r_pd² + r_qd², which is equal to (k² - 1)/k² (Fig. 13B).
Figure 13. Expected relationships among r_pd, r_qd, and r_pq, when extinction rate is more variable than origination rate. For this figure, the ratio k of s_q to s_p is equal to 1.4. A, r_pd (solid) and r_qd (dashed) as a function of r_pq. When extinction rate varies more than origination rate, |r_qd| > |r_pd| for all values of r_pq, and r_pd > 0 only if r_pq < 1/k. B, r_pd² + r_qd² as a function of r_pq. Lines marked by values in the margins are cases discussed in the text.

[Foote Figure 13 (one column): A, r_pd and r_qd against r_pq, with r_pq = 1/k and the margin values 1/√(1 + k²), -k/√(1 + k²), and -√(k² - 1)/k marked; B, r_pd² + r_qd² against r_pq, with r_pq = 1/k and r_pq = 2k/(k² + 1) marked and minimum (k² - 1)/k².]
When k < 1, the behavior of r_pd and r_qd is reversed compared with the case where k > 1 (Fig. 14). Here, the minimum of r_pd is equal to √(1 - k²) and occurs when r_pq = k; this corresponds to r_qd = 0 and also to the minimum of r_pd² + r_qd² = 1 - k².

Figures 13 and 14 show that positive correlations between extinction and diversification and negative correlations between origination and diversification are theoretically possible. They are empirically uncommon, however (Foote 2000a; this study).

When k = 1, r_pd and r_qd are equal in magnitude and opposite in sign for any value of r_pq, decreasing in magnitude monotonically as r_pq increases (Fig. 15A). As a result, r_pd² + r_qd² also decreases monotonically as r_pq increases (Fig. 15B).

Regardless of the value of k, r_pd² + r_qd² = 1 at two points: when r_pq = 0 and when r_pq = 2k/(k² + 1). r_pd² + r_qd² is less than unity if r_pq is between 0 and 2k/(k² + 1).

A few special cases are worth noting:

If k = 1, then r_pd = √[(1 - r_pq)/2], r_qd = -√[(1 - r_pq)/2], and r_pd² + r_qd² = 1 - r_pq (Fig. 15).

If r_pq = 0, then r_pd = 1/√(k² + 1), r_qd = -k/√(k² + 1), and r_pd² + r_qd² = 1 (Figs. 13-15).

If k = 1 and r_pq = 0, then r_pd = 1/√2, r_qd = -1/√2, and r_pd² + r_qd² = 1 (Fig. 15).

If, as in this study, we work with first differences, then, because Δd = Δp - Δq, we simply substitute Δp, Δq, and Δd for p, q, and d in all the foregoing expressions.
Figure 14. Expected relationships among r_pd, r_qd, and r_pq, when origination rate is more variable than extinction rate. For this figure, the ratio k of s_q to s_p is equal to 0.8. See Figure 13 for explanation. When origination rate varies more than extinction rate, |r_pd| > |r_qd| for all values of r_pq, and r_qd < 0 only if r_pq < k.

[Foote Figure 14 (one column): A, r_pd and r_qd against r_pq, with r_pq = k and the margin values 1/√(1 + k²), -k/√(1 + k²), and √(1 - k²) marked; B, r_pd² + r_qd² against r_pq, with r_pq = k and r_pq = 2k/(k² + 1) marked and minimum 1 - k².]
Figure 15. Expected relationships among r_pd, r_qd, and r_pq, when origination and extinction are equally variable. See Figure 13 for explanation.

[Foote Figure 15 (one column): A, r_pd and r_qd against r_pq for k = 1; B, r_pd² + r_qd² against r_pq, declining from 2 at r_pq = -1 to 0 at r_pq = 1.]
More informationNotes for Math 324, Part 19
48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which
More informationTutorial for Lecture Course on Modelling and System Identification (MSI) Albert-Ludwigs-Universität Freiburg Winter Term
Tutorial for Lecture Course on Modelling and System Identification (MSI) Albert-Ludwigs-Universität Freiburg Winter Term 2016-2017 Tutorial 3: Emergency Guide to Statistics Prof. Dr. Moritz Diehl, Robin
More informationSec$on Summary. Assigning Probabilities Probabilities of Complements and Unions of Events Conditional Probability
Section 7.2 Sec$on Summary Assigning Probabilities Probabilities of Complements and Unions of Events Conditional Probability Independence Bernoulli Trials and the Binomial Distribution Random Variables
More informationRaquel Prado. Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010
Raquel Prado Name: Department of Applied Mathematics and Statistics AMS-131. Spring 2010 Final Exam (Type B) The midterm is closed-book, you are only allowed to use one page of notes and a calculator.
More informationSTAT 430/510 Probability Lecture 7: Random Variable and Expectation
STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula
More informationTheory of probability and mathematical statistics
Theory of probability and mathematical statistics Tomáš Mrkvička Bibliography [1] J. [2] J. Andďż l: Matematickďż statistika, SNTL/ALFA, Praha 1978 Andďż l: Statistickďż metody, Matfyzpress, Praha 1998
More informationChapter 4. Chapter 4 sections
Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation
More informationMean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102
Mean, Median and Mode Lecture 3 - Axioms of Probability Sta102 / BME102 Colin Rundel September 1, 2014 We start with a set of 21 numbers, ## [1] -2.2-1.6-1.0-0.5-0.4-0.3-0.2 0.1 0.1 0.2 0.4 ## [12] 0.4
More informationTopic 3 Random variables, expectation, and variance, II
CSE 103: Probability and statistics Fall 2010 Topic 3 Random variables, expectation, and variance, II 3.1 Linearity of expectation If you double each value of X, then you also double its average; that
More informationWhat is a random variable
OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr
More informationLecture 13 (Part 2): Deviation from mean: Markov s inequality, variance and its properties, Chebyshev s inequality
Lecture 13 (Part 2): Deviation from mean: Markov s inequality, variance and its properties, Chebyshev s inequality Discrete Structures II (Summer 2018) Rutgers University Instructor: Abhishek Bhrushundi
More informationChapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
Chapter 2 Some Basic Probability Concepts 2.1 Experiments, Outcomes and Random Variables A random variable is a variable whose value is unknown until it is observed. The value of a random variable results
More informationMultivariate Distributions (Hogg Chapter Two)
Multivariate Distributions (Hogg Chapter Two) STAT 45-1: Mathematical Statistics I Fall Semester 15 Contents 1 Multivariate Distributions 1 11 Random Vectors 111 Two Discrete Random Variables 11 Two Continuous
More informationStatistics and data analyses
Statistics and data analyses Designing experiments Measuring time Instrumental quality Precision Standard deviation depends on Number of measurements Detection quality Systematics and methology σ tot =
More information4th IIA-Penn State Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur
4th IIA-Penn State Astrostatistics School July, 2013 Vainu Bappu Observatory, Kavalur Laws of Probability, Bayes theorem, and the Central Limit Theorem Rahul Roy Indian Statistical Institute, Delhi. Adapted
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More informationPreliminary Statistics Lecture 3: Probability Models and Distributions (Outline) prelimsoas.webs.com
1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 3: Probability Models and Distributions (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics,
More informationIntroduction to Machine Learning
What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes
More informationReview of Probability Theory
Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving
More informationOverview. CSE 21 Day 5. Image/Coimage. Monotonic Lists. Functions Probabilistic analysis
Day 5 Functions/Probability Overview Functions Probabilistic analysis Neil Rhodes UC San Diego Image/Coimage The image of f is the set of values f actually takes on (a subset of the codomain) The inverse
More informationMATH1231 Algebra, 2017 Chapter 9: Probability and Statistics
MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra
More informationRandom Variables and Their Distributions
Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital
More informationRandom Variables. P(x) = P[X(e)] = P(e). (1)
Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment
More informationPractice Examination # 3
Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single
More informationMultivariate probability distributions and linear regression
Multivariate probability distributions and linear regression Patrik Hoyer 1 Contents: Random variable, probability distribution Joint distribution Marginal distribution Conditional distribution Independence,
More informationRecitation 2: Probability
Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions
More informationCourse: ESO-209 Home Work: 1 Instructor: Debasis Kundu
Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear
More informationSTAT 516 Midterm Exam 3 Friday, April 18, 2008
STAT 56 Midterm Exam 3 Friday, April 8, 2008 Name Purdue student ID (0 digits). The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional
More information3 Multiple Discrete Random Variables
3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f
More informationProbability. VCE Maths Methods - Unit 2 - Probability
Probability Probability Tree diagrams La ice diagrams Venn diagrams Karnough maps Probability tables Union & intersection rules Conditional probability Markov chains 1 Probability Probability is the mathematics
More informationGEOS 36501/EVOL January 2012 Page 1 of 23
GEOS 36501/EVOL 33001 13 January 2012 Page 1 of 23 III. Sampling 1 Overview of Sampling, Error, Bias 1.1 Biased vs. random sampling 1.2 Biased vs. unbiased statistic (or estimator) 1.3 Precision vs. accuracy
More informationRefresher on Discrete Probability
Refresher on Discrete Probability STAT 27725/CMSC 25400: Machine Learning Shubhendu Trivedi University of Chicago October 2015 Background Things you should have seen before Events, Event Spaces Probability
More informationDiscrete Probability Distribution
Shapes of binomial distributions Discrete Probability Distribution Week 11 For this activity you will use a web applet. Go to http://socr.stat.ucla.edu/htmls/socr_eperiments.html and choose Binomial coin
More informationSTA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS
STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru, Venkateswara Rao STA 2023 Spring 2016 1 1. A committee of 5 persons is to be formed from 6 men and 4 women. What
More informationAnalysis of Engineering and Scientific Data. Semester
Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each
More informationProbability theory. References:
Reasoning Under Uncertainty References: Probability theory Mathematical methods in artificial intelligence, Bender, Chapter 7. Expert systems: Principles and programming, g, Giarratano and Riley, pag.
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More informationLecture 3 - Axioms of Probability
Lecture 3 - Axioms of Probability Sta102 / BME102 January 25, 2016 Colin Rundel Axioms of Probability What does it mean to say that: The probability of flipping a coin and getting heads is 1/2? 3 What
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More informationPractice Midterm 2 Partial Solutions
8.440 Practice Midterm 2 Partial Solutions. (20 points) Let X and Y be independent Poisson random variables with parameter. Compute the following. (Give a correct formula involving sums does not need to
More informationChapter 5 Joint Probability Distributions
Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 5 Joint Probability Distributions 5 Joint Probability Distributions CHAPTER OUTLINE 5-1 Two
More informationTopic 3: The Expectation of a Random Variable
Topic 3: The Expectation of a Random Variable Course 003, 2016 Page 0 Expectation of a discrete random variable Definition: The expected value of a discrete random variable exists, and is defined by EX
More informationName: Firas Rassoul-Agha
Midterm 1 - Math 5010 - Spring 016 Name: Firas Rassoul-Agha Solve the following 4 problems. You have to clearly explain your solution. The answer carries no points. Only the work does. CALCULATORS ARE
More informationSTA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS
STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru Venkateswara Rao, Ph.D. STA 2023 Fall 2016 Venkat Mu ALL THE CONTENT IN THESE SOLUTIONS PRESENTED IN BLUE AND BLACK
More informationProbability (10A) Young Won Lim 6/12/17
Probability (10A) Copyright (c) 2017 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later
More informationChapter 8: An Introduction to Probability and Statistics
Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including
More information