Discrete Distributions


STA 281, Fall

1 Introduction

Previously we defined a random variable to be an experiment with numerical outcomes. Different random variables are often related in that they have the same sample space or the same form for their probabilities. For example, suppose X is a random variable with P(X=0)=0.3 and P(X=1)=0.7, while Y is a random variable with P(Y=0)=0.4 and P(Y=1)=0.6. These random variables are similar in that they share the sample space S={0,1}, but different in that they assign different probabilities to its elements. In fact, there are many random variables with sample space S={0,1}: for every real number p in (0,1), we can define X such that P(X=1)=p and P(X=0)=1-p. These probabilities sum to 1 and are both non-negative because 0<p<1, so the random variable satisfies the axioms of probability.

When we have a set of similar random variables, we call the set a family of distributions. Typically families are indexed by a parameter, such as p in the previous example. For each value of the parameter we have a different random variable, but one that can be described by a common equation involving the parameter. The family described in the previous example is called the Bernoulli family with parameter p. The parameter space is the set of possible values of the parameter; for the Bernoulli family, the parameter space is p in (0,1).

Bernoulli random variables are used in a variety of applications. The simplest example is flipping a fair coin. If we create a random variable X by assigning 1 to heads and 0 to tails, then X ~ Bern(0.5) (since each outcome is equally likely, p must be 0.5). Bernoulli random variables are also used in voter polls. If Fred and Barney are running against each other in an election, there is some proportion p of people who will vote for Fred.
If Fred is unpopular, then p may be small; if the race is close, p may be near 0.5; and if Fred is winning by a landslide, p may be above 0.8. If we select one individual at random from the population and ask whether they intend to vote for Fred, we get a yes or no answer. If we construct a random variable Y by assigning 1 to yes and 0 to no, then Y ~ Bern(p). Any experiment resulting in two outcomes can be transformed to a Bernoulli by assigning one of the outcomes to 1 and the other outcome to 0.

Because all Bernoulli random variables have a common set of possible values and a common form for the probabilities, we can solve for the expectation and variance of a Bernoulli random variable in terms of the parameter p. Let X ~ Bern(p). The expectation of the random variable is defined to be E[X] = Σx x P(X=x). The random variable X has two possible outcomes, 0 and 1, that occur with probabilities 1-p and p, so

E[X] = 0(1-p) + 1(p) = p
E[X^2] = 0^2(1-p) + 1^2(p) = p
V[X] = E[X^2] - (E[X])^2 = p - p^2 = p(1-p)

So, for example, a Bern(0.7) random variable has mean 0.7 and variance (0.7)(0.3) = 0.21.

2 Experiments Based on Bernoulli Distribution

Bernoulli distributions form the basis for several important families of distributions. Complicated experiments can often be described by specific combinations of several Bernoulli random variables.
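The Bernoulli moments derived above can be checked numerically. A minimal sketch (the function names are mine, not part of the handout):

```python
def bernoulli_pmf(p):
    """Pmf of a Bern(p) random variable as a dict {value: probability}."""
    return {0: 1 - p, 1: p}

def mean(pmf):
    """E[X] = sum of x * P(X = x) over the possible values."""
    return sum(x * prob for x, prob in pmf.items())

def variance(pmf):
    """V[X] = E[(X - E[X])^2]."""
    mu = mean(pmf)
    return sum((x - mu) ** 2 * prob for x, prob in pmf.items())

pmf = bernoulli_pmf(0.7)
print(mean(pmf), variance(pmf))  # E[X] = 0.7 and V[X] = 0.7 * 0.3 = 0.21, up to float rounding
```

The same `mean` and `variance` helpers work for any finite pmf, which is convenient for the distributions that follow.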

Since the Bernoulli distribution is simple, thinking of more complicated distributions in terms of Bernoulli random variables often makes them easier to understand. We will consider two experiments built from Bernoulli random variables.

One experiment consists of drawing a sample of n Bernoulli random variables and counting how many ones appear in the sample. This experiment forms the basis of voter polling. Usually we don't ask just one person who they are going to vote for, but many; Gallup polls, for example, typically question around 1500 people. The count of the number of ones in the sample is used to make inferences about who will win the election. Another example is quality control, where we are interested in determining the proportion of products that are of acceptable quality. To assess the quality of the products, we test products as they come off the assembly line. Each product is classified as acceptable (1) or unacceptable (0). A count of acceptable items corresponds to a count of the number of ones.

This description alone (take n Bernoullis and count the number of successes) is not sufficient to specify the distribution. We will consider two specific variants. The first scenario, called a Binomial experiment, states that the n Bernoulli random variables are all independent and have the same probability of success p. If Y is the number of successes in this scenario, then Y has a Binomial distribution with parameters n and p (see section 2.1). When n is large and p is small, the Binomial distribution may be approximated by a Poisson distribution (see section 2.2). In contrast to a Binomial experiment, the n Bernoulli random variables may instead arise from sampling n items from a finite population divided into M successes and N-M failures. We have actually already considered this scenario. If Y counts the number of successes in this scenario, then Y has a Hypergeometric distribution with parameters N, M, and n (see section 2.3).
Under some conditions, the Binomial distribution and the Hypergeometric distribution closely approximate each other (see section 2.4).

The alternative experiment (as opposed to counting the number of successes in n trials) is to observe successive independent Bernoulli random variables, all with the same parameter p, and count the number of failures that occur before the rth success. Note that in the previous experiment we fixed the number of trials, n, and counted the number of successes. Here we do the opposite: we fix the number of successes, r, and count the number of failures. We could equivalently count the number of trials, which is the number of failures plus the number of successes r. The number of failures before the rth success has a Negative Binomial distribution with parameters r and p (see section 2.5).

2.1 Binomial

The Binomial distribution is one of the most commonly used distributions in statistics and, with the possible exception of the normal distribution, the distribution we will concentrate on most in this course. The binomial distribution describes the results of voter polls, clinical trials, and many other sampling procedures. A Binomial experiment has the following properties:

1. We observe a fixed, known number n of Bernoulli trials.
2. The Bernoulli trials are all independent with the same p.
3. The random variable of interest is the number of ones observed.

The first requirement just says we know the maximum number of ones in advance. The second requirement is often satisfied in practice and provides mathematical convenience in describing a Binomial distribution. By assuming the Bernoullis are independent, we may multiply probabilities together rather than using conditional probabilities. Assuming all the Bernoullis have the same p further simplifies the calculations.

In a Binomial experiment we call our n Bernoulli trials X1, X2, ..., Xn. We know that each Xi ~ Bern(p). Our random variable of interest is the number of ones among the Xi, so let us construct another random variable Y which is the sum of the n Xi, Y = X1 + X2 + ... + Xn. Such a random variable Y has a Binomial distribution with parameters n and p, written Y ~ Bin(n,p). The Binomial random variable simply records the number of ones in the sample, since each Xi=1 increments Y while each Xi=0 does nothing to Y.

Recall that the distribution of a random variable is a list of the possible values combined with the probability of each value occurring. For simplicity, let us assume that n=4. If we observe 4 Bernoullis and count the number of ones, we must observe 0, 1, 2, 3, or 4 ones. These are the possible values of Y. Calculating the probabilities of each of these outcomes requires looking at the individual Bernoulli probabilities. There are 4 Xi random variables, each with two possible values, 0 and 1. This results in 16 (2^4) possible combinations of the set (X1,X2,X3,X4): these are 0000, 0001, 0010, ..., 1110, 1111. Using the assumption that all the Xi are independent and have the same probability p, we may compute the probability of observing any particular set of Xi values. For example, the probability of observing X1=1, X2=0, X3=1, X4=1 is

P(X1=1 and X2=0 and X3=1 and X4=1) = P(X1=1)P(X2=0)P(X3=1)P(X4=1) = p(1-p)pp = p^3(1-p)

We may rewrite the intersection as a product because of the independence of the Xi. We may compute the probability for all of the sixteen combinations. In addition, for each combination we may compute the value of Y. For example, if X1=1, X2=0, X3=1, and X4=1, then Y = 1+0+1+1 = 3. The table shows the probabilities and Y for the sixteen combinations.

X1 X2 X3 X4   Y   Probability
 0  0  0  0   0   (1-p)^4
 0  0  0  1   1   p(1-p)^3
 0  0  1  0   1   p(1-p)^3
 0  0  1  1   2   p^2(1-p)^2
 0  1  0  0   1   p(1-p)^3
 0  1  0  1   2   p^2(1-p)^2
 0  1  1  0   2   p^2(1-p)^2
 0  1  1  1   3   p^3(1-p)
 1  0  0  0   1   p(1-p)^3
 1  0  0  1   2   p^2(1-p)^2
 1  0  1  0   2   p^2(1-p)^2
 1  0  1  1   3   p^3(1-p)
 1  1  0  0   2   p^2(1-p)^2
 1  1  0  1   3   p^3(1-p)
 1  1  1  0   3   p^3(1-p)
 1  1  1  1   4   p^4

Notice that all combinations with the same Y have the same probability. For example, all the rows where Y=2 have probability p^2(1-p)^2.
This is because all rows with Y=2 have 2 ones and 2 zeros, the ones occurring with probability p and the zeros occurring with probability (1-p). The sixteen combinations in the table are the sample space for the four Bernoullis. To find P(Y=2), we may just

sum the probabilities of the outcomes corresponding to Y=2. There are six such combinations, each with probability p^2(1-p)^2, so P(Y=2) = 6p^2(1-p)^2. We may find P(Y=3) similarly. Each combination with Y=3 has 3 ones and 1 zero, so it occurs with probability p^3(1-p). There are 4 rows with Y=3, so P(Y=3) = 4p^3(1-p). We could similarly determine the probabilities for the remaining values of Y.

While n is always known in a Binomial experiment, it differs from experiment to experiment, so we want formulas for arbitrary values of n. If we count the number of ones for n Bernoulli trials, we will find the possible values are the integers 0, 1, 2, ..., n. To find their probabilities, we could construct a table as with n=4, but if n is even moderately large the table will be too large to write conveniently. In principle such a table could be constructed and we could find the probabilities of each combination of the Xi. Each combination with y ones and n-y zeros will have probability p^y(1-p)^(n-y) (recall each one occurs with probability p and each zero occurs with probability 1-p). We must then multiply p^y(1-p)^(n-y) by the number of rows with Y=y to determine the probability. Each row with Y=y has y ones and n-y zeros. How many rows are there with y ones? This is the number of ways y ones may be placed in n slots, or n choose y, written C(n,y). In general, if Y ~ Bin(n,p), then

P(Y=y) = C(n,y) p^y (1-p)^(n-y)   for y = 0, 1, ..., n

To find the expectation of Y, we could look directly at the definition E[Y] = Σy y P(Y=y). This results in the complicated formula

E[Y] = Σ(y=0 to n) y C(n,y) p^y (1-p)^(n-y)

It is simpler to observe that Y is a linear combination of the Xi, and use the formulas for linear combinations to compute the mean and variance of Y. Recall that the mean and variance of each Xi are p and p(1-p), since each Xi is distributed Bern(p). Since Y = X1 + X2 + ... + Xn, we may derive

E[Y] = E[X1] + E[X2] + ... + E[Xn] = p + p + ... + p = np
V[Y] = V[X1] + V[X2] + ... + V[Xn] = np(1-p)

where the variance formula uses the independence of the Xi. The expectation formula is intuitive. If we have 10 flips of a coin that is heads with probability 0.70, we expect about 10(0.7)=7 heads.
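The table-building argument for n=4 can be checked by brute force for any small n: enumerate every sequence of zeros and ones, multiply the independent Bernoulli probabilities, and group by Y. A sketch (function names are mine):

```python
from itertools import product
from math import comb

def binomial_pmf_by_enumeration(n, p):
    """Build the pmf of Y = number of ones by listing all 2^n outcomes."""
    pmf = {y: 0.0 for y in range(n + 1)}
    for outcome in product([0, 1], repeat=n):   # e.g. (1, 0, 1, 1)
        prob = 1.0
        for x in outcome:                       # independence lets us multiply
            prob *= p if x == 1 else 1 - p
        pmf[sum(outcome)] += prob               # group outcomes by Y
    return pmf

def binomial_pmf(n, p, y):
    """Closed form: C(n, y) p^y (1-p)^(n-y)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 4, 0.3
enumerated = binomial_pmf_by_enumeration(n, p)
for y in range(n + 1):
    assert abs(enumerated[y] - binomial_pmf(n, p, y)) < 1e-12
```

The enumeration is exponential in n, which is exactly why the closed form C(n,y) p^y (1-p)^(n-y) is the one used in practice.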
Often we are more interested in the sample proportion than the actual count. For example, election polls usually report what percentage of those surveyed would vote for a particular candidate, not the actual count. If Y is the number of voters preferring the candidate, then the proportion that prefer the candidate is p̂ = Y/n. We may compute probabilities for sample proportions just as we would for counts. If we interview 1000 people, the probability the proportion will be 0.712 is the same as the probability that Y=712. The sample proportion is a linear transformation of Y, since p̂ = (1/n)Y. Using the formulas for the mean and variance of a linear transformation, we may derive

E[p̂] = (1/n)E[Y] = np/n = p
V[p̂] = (1/n^2)V[Y] = np(1-p)/n^2 = p(1-p)/n

The expectation of p̂ is p, which indicates that, on average, the sample proportion is equal to the population proportion, one of several reasons the sample proportion is typically used to estimate p. The property that an estimator is, on average, equal to the actual value is called unbiasedness. The variance of p̂ decreases as the sample size n increases, because n is in the denominator of V[p̂]. This indicates that as the sample size increases, the sample proportion becomes more precise (less variable). Combining the mean and variance of p̂, we may observe that for large samples, the sample proportion is very likely to be close to the population proportion p.

2.2 Approximation of Binomial by Poisson

Often we are faced with a binomial experiment with a large n and a small p. For example, consider the number of pages in a book with typographical errors. After the book has been proofread many times, there is a small probability any given page will have an error, but a book contains many pages. A telephone company is often interested in how many people call their operator. They serve many customers, but in a given day each customer has only a small probability of calling the operator. In a football season, there is a small probability of a turnover on a given play, but a season consists of many plays. If we are interested in computing probabilities for the number of ones observed in a binomial experiment, we could use the binomial distribution. However, if n is large and p is small, the Poisson distribution is often used as a convenient approximation. As a rule of thumb, if n ≥ 100, p ≤ 0.05, and np ≤ 20, then a Bin(n,p) distribution may be accurately approximated by a Poisson distribution with parameter λ = np, written X ~ Poi(λ). As stated earlier, defining a random variable involves stating the possible values of the variable and giving their probabilities. If X ~ Poi(λ), then the possible values of X are all the nonnegative integers.
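The rule of thumb above is easy to probe numerically. A sketch (λ is written `lam` because Python identifiers cannot use the Greek letter; the choice of Bin(100, 0.05) matches the hotel example that follows):

```python
from math import comb, exp, factorial

def binom_pmf(n, p, x):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(lam, x):
    return exp(-lam) * lam**x / factorial(x)

# Bin(100, 0.05) satisfies n >= 100, p <= 0.05, np = 5 <= 20, so Poi(5)
# should track it closely across the whole support.
n, p = 100, 0.05
lam = n * p
worst = max(abs(binom_pmf(n, p, x) - poisson_pmf(lam, x)) for x in range(n + 1))
print(worst)  # largest pointwise gap, roughly 0.005
```

Rerunning with n=500, p=0.01 or n=5000, p=0.001 (np still 5) shrinks the worst-case gap further, which is the pattern the table below illustrates.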
Although the set of possible values differs between the Poisson and Binomial (the Binomial only has possible values 0, 1, ..., n), the Poisson distribution places very little probability on values greater than n, so the approximation is still reasonable. The possible outcomes have probabilities

P(X=x) = e^(-λ) λ^x / x!   for x = 0, 1, 2, ...

These probabilities sum to 1 since Σ(x=0 to ∞) λ^x/x! = e^λ.

As an illustration, suppose we have a hotel with 100 rooms and that the probability any given room will ask for room service from 2AM to 6AM is 0.05. Suppose we want to find exact and approximate probabilities that zero through five rooms will ask for room service. The exact distribution is Bin(n=100, p=0.05). The parameter values are sufficient to use a Poi(λ=np=5) distribution as an approximation. We computed probabilities according to both the Binomial and Poisson families. For example, let X be the number of rooms asking for room service. The exact probability (Binomial) that X=4 is C(100,4)(0.05)^4(0.95)^96 ≈ 0.1781

and the approximate probability (Poisson) is e^(-5) 5^4/4! ≈ 0.1755. These are reasonably similar. Recall that the Poisson has an infinite number of possible values. We know that P(X=101)=0, since there are not 101 rooms in the hotel. The Poisson does give the event X=101 nonzero probability, but that approximate probability, e^(-5) 5^101/101!, is smaller than 10^(-90). While this value is not exactly 0, it is extremely close, so the approximation is still reasonable. For a fixed λ, the larger n is, the better the approximation.

For X=0 through X=5, the approximate probabilities are the second row of the following table and the exact probabilities are the third row. Although not exact, the Poisson approximation is reasonable. The approximation improves as n increases with np held constant. Suppose the hotel had 500 rooms and the probability any given room orders room service between 2AM and 6AM is 0.01. The expected number of rooms ordering room service is also np=5, so a Poi(5) distribution could again be used as an approximation. The exact probabilities are shown in the fourth row of the table. The approximation has improved accuracy for the larger n. If n is increased to 5000 and p decreased to 0.001 (exact probabilities shown in the fifth row of the table), where np is also 5, the approximation is almost exact.

Distribution             X=0     X=1     X=2     X=3     X=4     X=5
Poi(λ=5)                0.0067  0.0337  0.0842  0.1404  0.1755  0.1755
Bin(n=100, p=0.05)      0.0059  0.0312  0.0812  0.1396  0.1781  0.1800
Bin(n=500, p=0.01)      0.0066  0.0332  0.0836  0.1402  0.1759  0.1763
Bin(n=5000, p=0.001)    0.0067  0.0336  0.0842  0.1404  0.1755  0.1756

If X ~ Poi(λ), then E[X] = λ and V[X] = λ. These quantities approximate the Binomial expectation and variance. The expectations are the same, since λ = np. The variances are close, with the Poisson variance λ = np and the Binomial variance np(1-p). If p is small, then 1-p will be close to 1, so np(1-p) will be close to np.

2.3 Hypergeometric Distribution

Suppose we have a population of N individuals consisting of M ones and N-M zeros (again, the ones and zeros may indicate any dichotomous (two-valued) variable, such as gender). We sample n individuals from the population at random and let X be the number of ones in the sample.
The random variable X then has a hypergeometric distribution with parameters N, M, and n, written X ~ HyperGeo(N,M,n). While we still have a sample of n individuals from a population and record the number of ones in the sample (as with the Binomial), here our sampling scheme creates dependencies among the Bernoulli trials. We can think of sampling n individuals at random as picking a single individual at random from the N individuals, then picking a second individual from the N-1 individuals remaining, and so on. The Bernoullis are dependent because whether a one or a zero is chosen on the second pick depends on whether a one or a zero was chosen on the first pick. For example, suppose a class consists of 30 men and 10 women, and we select 5 students at random. The probability we select a man on the first pick is 30/40. If a man is chosen with the first pick, the probability a man is chosen on the second pick

is 29/39. In contrast, if a woman is chosen with the first pick, the probability a man is chosen with the second pick is 30/39. Since the probability of choosing a man with the second pick depends on the gender of the first person picked, the Bernoulli trials are dependent and thus the Binomial distribution does not apply.

Although this is not a Binomial experiment, we have already computed the probabilities for this scenario. We want to find the probability of randomly selecting x ones from a population of M ones and N-M zeros. The probability X=x is

P(X=x) = C(M,x) C(N-M,n-x) / C(N,n)

The possible values of this distribution require some thought to determine. Clearly x must be an integer, and must be at least 0 and at most n. However, depending on the number of ones and zeros in the population, some values may not be possible. For example, if the population consists entirely of zeros, then the only possible value is x=0. The largest value x may take is n, provided there are enough ones in the population that n ones may be selected. If M is less than n, so there are fewer ones available than the number of individuals we are selecting, then we may observe at most M ones. The largest possible value of x is therefore min(n,M). Similarly, if the number of zeros in the population, N-M, is fewer than the number of individuals we sample, then we must observe at least n-(N-M) = n-N+M ones. The possible values of x are therefore the integers between max(0, n-N+M) and min(n,M).

Through some algebra, it is possible to determine the mean and variance of the hypergeometric distribution. If X ~ HyperGeo(N,M,n), then

E[X] = n(M/N)
V[X] = n(M/N)(1-M/N)((N-n)/(N-1))

2.4 Comparison of Binomial and Hypergeometric Distribution

Recall we said that for a Hypergeometric distribution, the Bernoulli trials are dependent. We said that in a population of 30 men and 10 women, the probability of selecting a man on the second pick depended on the gender of the first pick. We said that if a man was picked first, there was a 29/39 probability of selecting a man second.
If a woman was picked first, there was a 30/39 probability of selecting a man second. The difference between 29/39 and 30/39 is not that great. There is dependence, but is it enough to make a difference? In fact, the answer is no: the dependence between the picks is small enough that we may ignore it. As a general rule, the larger the population, the easier it is to ignore the dependence. If, for example, the class consisted of 4000 individuals with 1000 women, the difference between 3000/3999 and 2999/3999 is extremely small (0.7502 versus 0.7499). For large populations, the difference between the probabilities is so small that often we completely ignore the dependence and use the binomial distribution. In voter polling, there are often millions of voters. Although technically the correct distribution is Hypergeometric, the Binomial is always used to compute margins of error.

Suppose the class consists of 25 individuals with 5 ones and you select 5 individuals. Let X be the number of ones observed in the sample. The possible values of X are the integers zero through five. The Hypergeometric distribution is the exact distribution, since we are drawing without replacement

from a finite population. We may find the hypergeometric probabilities directly from the definition, such as P(X=1) = C(5,1)C(20,4)/C(25,5) = 24225/53130 ≈ 0.4560. The full set of probabilities is shown in the second row of the table below. We could approximate these probabilities by a Binomial distribution. We are sampling n=5 individuals from a population that contains a proportion p=5/25=0.2 of ones. The binomial probabilities are shown in the fifth row of the table. The values are not very similar, although they are not vastly different, either. The accuracy of the binomial approximation increases as N increases (with the proportions remaining equal). Suppose the class consisted of 250 individuals with 50 ones (p=0.2) and we sample 5 individuals. The hypergeometric probabilities are shown in the third row of the table and match the binomial probabilities fairly closely. As N is increased further, to 2500, also with M=500 so p=0.2, the probabilities are almost exactly equal.

Distribution                  X=0     X=1     X=2     X=3     X=4     X=5
HyperGeo(N=25,M=5,n=5)       0.2918  0.4560  0.2146  0.0358  0.0019  0.0000
HyperGeo(N=250,M=50,n=5)     0.3244  0.4137  0.2058  0.0499  0.0059  0.0003
HyperGeo(N=2500,M=500,n=5)   0.3274  0.4100  0.2049  0.0511  0.0063  0.0003
Bin(n=5,p=0.2)               0.3277  0.4096  0.2048  0.0512  0.0064  0.0003

As a rule of thumb, HyperGeo(N,M,n) may be approximated by a Bin(n, p=M/N) if n/N ≤ 0.05, so that we are sampling no more than 5% of the population. When this approximation holds, the mean and variance of the approximating Binomial are close to the mean and variance of the original Hypergeometric distribution. Recall the expectation of a Binomial distribution is np, which for the approximation is n(M/N). This is exactly equal to the expectation of the Hypergeometric. The variance of the binomial is np(1-p), which for the approximation is n(M/N)(1-M/N). Compare this to the exact variance of the Hypergeometric,

V[X] = n(M/N)(1-M/N)((N-n)/(N-1))

The only difference between the exact variance and the approximate one is the term (N-n)/(N-1). However, remember we are using this approximation when n/N ≤ 0.05, which makes the (N-n)/(N-1) term approximately 1.

2.5 Negative Binomial Distribution

So far we have concentrated on collecting a sample of n Bernoulli random variables and counting the number of ones in the sample.
Another experiment we could perform is to sequentially observe Bernoulli random variables until we observe the first one, or the first r ones. For simplicity, we first describe observing Bernoullis until we observe the first one. After observing the first one, we let a random variable Y be the number of zeros observed. For example, suppose the first three Bernoulli trials are X1=0, X2=0, and X3=1. The first one appeared on the third trial, and we observed Y=2 zeros before the first one. The possible values of Y are the nonnegative integers 0, 1, 2, .... It could take an arbitrarily long time to observe the first one. Usually we will observe the first one fairly quickly, so large values of Y have small probabilities, but they are possible.
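The claim that large values of Y are possible but rare is easy to see by simulating the sequential experiment directly. A sketch (the parameter choice p = 0.5 and the function name are mine):

```python
import random

def zeros_before_first_one(p, rng):
    """Observe Bern(p) trials until the first one; return the number of zeros."""
    count = 0
    while rng.random() >= p:   # this trial is a one with probability p
        count += 1
    return count

rng = random.Random(0)         # fixed seed so the sketch is reproducible
p = 0.5
draws = [zeros_before_first_one(p, rng) for _ in range(100_000)]

# Most draws are small; ten or more zeros before the first one is rare.
print(sum(d >= 10 for d in draws) / len(draws))   # about (1-p)^10, i.e. 0.001
print(sum(draws) / len(draws))                    # about (1-p)/p = 1
```

The empirical frequencies line up with the probabilities derived next, p(1-p)^y for y zeros before the first one.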

To complete the description of the distribution, we also require the probabilities for each of the possible outcomes. If the Bernoulli trials we observe are independent and have the same probability p, this is relatively easy to calculate. To find P(Y=0), we must realize that the event Y=0 corresponds to observing no zeros before the first one. So Y=0 is the same event as X1=1, which we know occurs with probability p. The event Y=1 occurs if we observe one zero before the first one, so X1=0 and X2=1. This occurs with probability p(1-p). In general, the event Y=y corresponds to observing y zeros before the first one, so X1 through Xy are all 0 while Xy+1=1. This occurs with probability p(1-p)^y. This distribution, the number of zeros until the first one, is called the Geometric distribution.

More generally, instead of looking at the number of zeros observed before the first one, we may look at the number of zeros before the second one, or third one, or rth one. This is called the Negative Binomial distribution. The Geometric distribution is a Negative Binomial distribution with r=1. As with the geometric, if Y is the number of zeros before the rth one, the possible values are all nonnegative integers. We might find r ones in the first r trials, and therefore observe no zeros. Similarly, it could take an arbitrarily large number of Bernoullis before the rth one.

Suppose we are waiting for the 5th one. What is the probability we observe exactly 7 zeros before the 5th one? We calculate this probability by noticing that the event "the 5th one occurs after exactly 7 zeros" requires that we observe 12 Bernoulli trials, with 5 ones and 7 zeros, and that the 12th Bernoulli trial must have been a one, since we are waiting for the 5th one to appear. Therefore, the first 11 Bernoullis consist of 7 zeros and 4 ones, and the 12th Bernoulli is a one.
There are no restrictions on the order of ones and zeros for the first 11 Bernoullis, so this occurs with probability C(11,4) p^4 (1-p)^7. This quantity must be multiplied by the probability that the 12th Bernoulli is a one, which is p, so the actual probability is C(11,4) p^5 (1-p)^7.

In general, we want the probability of observing y zeros before we observe the rth one. For the event Y=y to occur, we must observe y+r Bernoullis, the first y+r-1 Bernoullis consisting of y zeros and r-1 ones, and then observe a one on the next Bernoulli trial. The probability is

P(Y=y) = C(y+r-1, r-1) p^r (1-p)^y

The mean and variance of the negative binomial distribution are E[Y] = r(1-p)/p and V[Y] = r(1-p)/p^2.

3 Recognizing Distributions

This handout discusses five distributions (Bernoulli, Binomial, Hypergeometric, Poisson, and Negative Binomial). One skill you should have is the ability to look at a word problem and determine which distribution is appropriate. After that, of course, you need to be able to apply formulas to find probabilities, means, and variances. The Bernoulli distribution is the building block of all the other distributions. It should be the simplest to recognize, since it has just two outcomes, 1 and 0, which occur with probabilities p and 1-p. Bernoullis may be combined in two ways in this course. Either we sample a group of n Bernoullis from a population and count the number of ones in the sample, or we observe a sequence of Bernoullis until we observe r ones and then count the number of zeros before the rth one occurred. The Binomial,

Hypergeometric, and Poisson distributions describe counting the number of ones, while the Negative Binomial distribution describes counting how many zeros occur before the rth one. If we are sampling n independent Bernoulli trials, all with the same probability p of success, then you should use a Bin(n,p) distribution. If you are randomly sampling n individuals from a population with M successes and N-M failures, use a HyperGeo(N,M,n) distribution. If you are doing neither, consult your friendly local statistician. There are two approximations we consider in this course. If the exact distribution is Bin(n,p) but n is large and p is small, the Poi(λ=np) distribution is an adequate approximation. If the exact distribution is HyperGeo(N,M,n) but n/N ≤ 0.05, then the Bin(n, p=M/N) distribution is an adequate approximation.
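The five families in this handout can be summarized in a few lines of code, one pmf per family. A sketch (function names are mine; the final check reuses the r=5, y=7 example from section 2.5):

```python
from math import comb, exp, factorial

def bernoulli_pmf(p, x):
    return p if x == 1 else 1 - p

def binomial_pmf(n, p, y):
    return comb(n, y) * p**y * (1 - p)**(n - y)

def hypergeometric_pmf(N, M, n, x):
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

def poisson_pmf(lam, x):
    return exp(-lam) * lam**x / factorial(x)

def negative_binomial_pmf(r, p, y):
    """P(y zeros before the r-th one) = C(y+r-1, r-1) p^r (1-p)^y."""
    return comb(y + r - 1, r - 1) * p**r * (1 - p)**y

# Section 2.5's example: 7 zeros before the 5th one means 11 trials with
# 4 ones and 7 zeros in any order, then a one on the 12th trial.
p = 0.4
assert abs(negative_binomial_pmf(5, p, 7) - comb(11, 4) * p**5 * (1 - p)**7) < 1e-12
```

Setting r=1 in `negative_binomial_pmf` recovers the Geometric pmf p(1-p)^y, matching the handout's observation that the Geometric is a special case.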


More information

Geometric Distribution The characteristics of a geometric experiment are: 1. There are one or more Bernoulli trials with all failures except the last

Geometric Distribution The characteristics of a geometric experiment are: 1. There are one or more Bernoulli trials with all failures except the last Geometric Distribution The characteristics of a geometric experiment are: 1. There are one or more Bernoulli trials with all failures except the last one, which is a success. In other words, you keep repeating

More information

Lectures on Elementary Probability. William G. Faris

Lectures on Elementary Probability. William G. Faris Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................

More information

Topic 3: The Expectation of a Random Variable

Topic 3: The Expectation of a Random Variable Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation

More information

Topic 9 Examples of Mass Functions and Densities

Topic 9 Examples of Mass Functions and Densities Topic 9 Examples of Mass Functions and Densities Discrete Random Variables 1 / 12 Outline Bernoulli Binomial Negative Binomial Poisson Hypergeometric 2 / 12 Introduction Write f X (x θ) = P θ {X = x} for

More information

CS 237: Probability in Computing

CS 237: Probability in Computing CS 237: Probability in Computing Wayne Snyder Computer Science Department Boston University Lecture 11: Geometric Distribution Poisson Process Poisson Distribution Geometric Distribution The Geometric

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Lecture 08: Poisson and More. Lisa Yan July 13, 2018

Lecture 08: Poisson and More. Lisa Yan July 13, 2018 Lecture 08: Poisson and More Lisa Yan July 13, 2018 Announcements PS1: Grades out later today Solutions out after class today PS2 due today PS3 out today (due next Friday 7/20) 2 Midterm announcement Tuesday,

More information

Using Probability to do Statistics.

Using Probability to do Statistics. Al Nosedal. University of Toronto. November 5, 2015 Milk and honey and hemoglobin Animal experiments suggested that honey in a diet might raise hemoglobin level. A researcher designed a study involving

More information

Chapter 2: Discrete Distributions. 2.1 Random Variables of the Discrete Type

Chapter 2: Discrete Distributions. 2.1 Random Variables of the Discrete Type Chapter 2: Discrete Distributions 2.1 Random Variables of the Discrete Type 2.2 Mathematical Expectation 2.3 Special Mathematical Expectations 2.4 Binomial Distribution 2.5 Negative Binomial Distribution

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

Basic Probability. Introduction

Basic Probability. Introduction Basic Probability Introduction The world is an uncertain place. Making predictions about something as seemingly mundane as tomorrow s weather, for example, is actually quite a difficult task. Even with

More information

2.6 Tools for Counting sample points

2.6 Tools for Counting sample points 2.6 Tools for Counting sample points When the number of simple events in S is too large, manual enumeration of every sample point in S is tedious or even impossible. (Example) If S contains N equiprobable

More information

Expectations. Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or

Expectations. Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or Expectations Expectations Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X ) or µ X, is E(X ) = µ X = x D x p(x) Expectations

More information

Each trial has only two possible outcomes success and failure. The possible outcomes are exactly the same for each trial.

Each trial has only two possible outcomes success and failure. The possible outcomes are exactly the same for each trial. Section 8.6: Bernoulli Experiments and Binomial Distribution We have already learned how to solve problems such as if a person randomly guesses the answers to 10 multiple choice questions, what is the

More information

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions Introduction to Statistical Data Analysis Lecture 3: Probability Distributions James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis

More information

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping

Random Variables. Definition: A random variable (r.v.) X on the probability space (Ω, F, P) is a mapping Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

PRACTICE PROBLEMS FOR EXAM 2

PRACTICE PROBLEMS FOR EXAM 2 PRACTICE PROBLEMS FOR EXAM 2 Math 3160Q Fall 2015 Professor Hohn Below is a list of practice questions for Exam 2. Any quiz, homework, or example problem has a chance of being on the exam. For more practice,

More information

Chapter 3. Discrete Random Variables and Their Probability Distributions

Chapter 3. Discrete Random Variables and Their Probability Distributions Chapter 3. Discrete Random Variables and Their Probability Distributions 2.11 Definition of random variable 3.1 Definition of a discrete random variable 3.2 Probability distribution of a discrete random

More information

Lecture 3. Discrete Random Variables

Lecture 3. Discrete Random Variables Math 408 - Mathematical Statistics Lecture 3. Discrete Random Variables January 23, 2013 Konstantin Zuev (USC) Math 408, Lecture 3 January 23, 2013 1 / 14 Agenda Random Variable: Motivation and Definition

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Let us think of the situation as having a 50 sided fair die; any one number is equally likely to appear.

Let us think of the situation as having a 50 sided fair die; any one number is equally likely to appear. Probability_Homework Answers. Let the sample space consist of the integers through. {, 2, 3,, }. Consider the following events from that Sample Space. Event A: {a number is a multiple of 5 5, 0, 5,, }

More information

Confidence Intervals for the Sample Mean

Confidence Intervals for the Sample Mean Confidence Intervals for the Sample Mean As we saw before, parameter estimators are themselves random variables. If we are going to make decisions based on these uncertain estimators, we would benefit

More information

Probability Distributions

Probability Distributions EXAMPLE: Consider rolling a fair die twice. Probability Distributions Random Variables S = {(i, j : i, j {,...,6}} Suppose we are interested in computing the sum, i.e. we have placed a bet at a craps table.

More information

DS-GA 1002 Lecture notes 11 Fall Bayesian statistics

DS-GA 1002 Lecture notes 11 Fall Bayesian statistics DS-GA 100 Lecture notes 11 Fall 016 Bayesian statistics In the frequentist paradigm we model the data as realizations from a distribution that depends on deterministic parameters. In contrast, in Bayesian

More information

( ) P A B : Probability of A given B. Probability that A happens

( ) P A B : Probability of A given B. Probability that A happens A B A or B One or the other or both occurs At least one of A or B occurs Probability Review A B A and B Both A and B occur ( ) P A B : Probability of A given B. Probability that A happens given that B

More information

Chapter 3 Discrete Random Variables

Chapter 3 Discrete Random Variables MICHIGAN STATE UNIVERSITY STT 351 SECTION 2 FALL 2008 LECTURE NOTES Chapter 3 Discrete Random Variables Nao Mimoto Contents 1 Random Variables 2 2 Probability Distributions for Discrete Variables 3 3 Expected

More information

Some Special Discrete Distributions

Some Special Discrete Distributions Mathematics Department De La Salle University Manila February 6, 2017 Some Discrete Distributions Often, the observations generated by different statistical experiments have the same general type of behaviour.

More information

ECE 302, Final 3:20-5:20pm Mon. May 1, WTHR 160 or WTHR 172.

ECE 302, Final 3:20-5:20pm Mon. May 1, WTHR 160 or WTHR 172. ECE 302, Final 3:20-5:20pm Mon. May 1, WTHR 160 or WTHR 172. 1. Enter your name, student ID number, e-mail address, and signature in the space provided on this page, NOW! 2. This is a closed book exam.

More information

Bernoulli Trials, Binomial and Cumulative Distributions

Bernoulli Trials, Binomial and Cumulative Distributions Bernoulli Trials, Binomial and Cumulative Distributions Sec 4.4-4.6 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 9-3339 Cathy Poliak,

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

Sociology 6Z03 Topic 10: Probability (Part I)

Sociology 6Z03 Topic 10: Probability (Part I) Sociology 6Z03 Topic 10: Probability (Part I) John Fox McMaster University Fall 2014 John Fox (McMaster University) Soc 6Z03: Probability I Fall 2014 1 / 29 Outline: Probability (Part I) Introduction Probability

More information

Math/Stat 352 Lecture 8

Math/Stat 352 Lecture 8 Math/Stat 352 Lecture 8 Sections 4.3 and 4.4 Commonly Used Distributions: Poisson, hypergeometric, geometric, and negative binomial. 1 The Poisson Distribution Poisson random variable counts the number

More information

1 Normal Distribution.

1 Normal Distribution. Normal Distribution.. Introduction A Bernoulli trial is simple random experiment that ends in success or failure. A Bernoulli trial can be used to make a new random experiment by repeating the Bernoulli

More information

Discrete Random Variables

Discrete Random Variables Discrete Random Variables An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan Introduction The markets can be thought of as a complex interaction of a large number of random processes,

More information

Confidence Intervals for the Mean of Non-normal Data Class 23, Jeremy Orloff and Jonathan Bloom

Confidence Intervals for the Mean of Non-normal Data Class 23, Jeremy Orloff and Jonathan Bloom Confidence Intervals for the Mean of Non-normal Data Class 23, 8.05 Jeremy Orloff and Jonathan Bloom Learning Goals. Be able to derive the formula for conservative normal confidence intervals for the proportion

More information

Introduction to Statistical Data Analysis Lecture 4: Sampling

Introduction to Statistical Data Analysis Lecture 4: Sampling Introduction to Statistical Data Analysis Lecture 4: Sampling James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis 1 / 30 Introduction

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

STAT:5100 (22S:193) Statistical Inference I

STAT:5100 (22S:193) Statistical Inference I STAT:5100 (22S:193) Statistical Inference I Week 3 Luke Tierney University of Iowa Fall 2015 Luke Tierney (U Iowa) STAT:5100 (22S:193) Statistical Inference I Fall 2015 1 Recap Matching problem Generalized

More information

Lecture 2: Discrete Probability Distributions

Lecture 2: Discrete Probability Distributions Lecture 2: Discrete Probability Distributions IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge February 1st, 2011 Rasmussen (CUED) Lecture

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

37.3. The Poisson Distribution. Introduction. Prerequisites. Learning Outcomes

37.3. The Poisson Distribution. Introduction. Prerequisites. Learning Outcomes The Poisson Distribution 37.3 Introduction In this Section we introduce a probability model which can be used when the outcome of an experiment is a random variable taking on positive integer values and

More information

CIS 2033 Lecture 5, Fall

CIS 2033 Lecture 5, Fall CIS 2033 Lecture 5, Fall 2016 1 Instructor: David Dobor September 13, 2016 1 Supplemental reading from Dekking s textbook: Chapter2, 3. We mentioned at the beginning of this class that calculus was a prerequisite

More information

1 Bernoulli Distribution: Single Coin Flip

1 Bernoulli Distribution: Single Coin Flip STAT 350 - An Introduction to Statistics Named Discrete Distributions Jeremy Troisi Bernoulli Distribution: Single Coin Flip trial of an experiment that yields either a success or failure. X Bern(p),X

More information

Statistics for Economists. Lectures 3 & 4

Statistics for Economists. Lectures 3 & 4 Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with

More information

4. Discrete Probability Distributions. Introduction & Binomial Distribution

4. Discrete Probability Distributions. Introduction & Binomial Distribution 4. Discrete Probability Distributions Introduction & Binomial Distribution Aim & Objectives 1 Aims u Introduce discrete probability distributions v Binomial distribution v Poisson distribution 2 Objectives

More information

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).

More information

Lecture 16. Lectures 1-15 Review

Lecture 16. Lectures 1-15 Review 18.440: Lecture 16 Lectures 1-15 Review Scott Sheffield MIT 1 Outline Counting tricks and basic principles of probability Discrete random variables 2 Outline Counting tricks and basic principles of probability

More information

Class 26: review for final exam 18.05, Spring 2014

Class 26: review for final exam 18.05, Spring 2014 Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event

More information

Lecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf)

Lecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf) Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution

More information

Lecture 8 : The Geometric Distribution

Lecture 8 : The Geometric Distribution 0/ 24 The geometric distribution is a special case of negative binomial, it is the case r = 1. It is so important we give it special treatment. Motivating example Suppose a couple decides to have children

More information

Chapter 5. Means and Variances

Chapter 5. Means and Variances 1 Chapter 5 Means and Variances Our discussion of probability has taken us from a simple classical view of counting successes relative to total outcomes and has brought us to the idea of a probability

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Ch. 7: Estimates and Sample Sizes

Ch. 7: Estimates and Sample Sizes Ch. 7: Estimates and Sample Sizes Section Title Notes Pages Introduction to the Chapter 2 2 Estimating p in the Binomial Distribution 2 5 3 Estimating a Population Mean: Sigma Known 6 9 4 Estimating a

More information

p. 4-1 Random Variables

p. 4-1 Random Variables Random Variables A Motivating Example Experiment: Sample k students without replacement from the population of all n students (labeled as 1, 2,, n, respectively) in our class. = {all combinations} = {{i

More information

Solving Equations by Adding and Subtracting

Solving Equations by Adding and Subtracting SECTION 2.1 Solving Equations by Adding and Subtracting 2.1 OBJECTIVES 1. Determine whether a given number is a solution for an equation 2. Use the addition property to solve equations 3. Determine whether

More information

Properties of Probability

Properties of Probability Econ 325 Notes on Probability 1 By Hiro Kasahara Properties of Probability In statistics, we consider random experiments, experiments for which the outcome is random, i.e., cannot be predicted with certainty.

More information

Carolyn Anderson & YoungShil Paek (Slide contributors: Shuai Wang, Yi Zheng, Michael Culbertson, & Haiyan Li)

Carolyn Anderson & YoungShil Paek (Slide contributors: Shuai Wang, Yi Zheng, Michael Culbertson, & Haiyan Li) Carolyn Anderson & YoungShil Paek (Slide contributors: Shuai Wang, Yi Zheng, Michael Culbertson, & Haiyan Li) Department of Educational Psychology University of Illinois at Urbana-Champaign 1 Inferential

More information

Lecture 4: Probability and Discrete Random Variables

Lecture 4: Probability and Discrete Random Variables Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1

More information

STAT509: Discrete Random Variable

STAT509: Discrete Random Variable University of South Carolina September 16, 2014 Motivation So far, we have already known how to calculate probabilities of events. Suppose we toss a fair coin three times, we know that the probability

More information

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,

More information

Week 6, 9/24/12-9/28/12, Notes: Bernoulli, Binomial, Hypergeometric, and Poisson Random Variables

Week 6, 9/24/12-9/28/12, Notes: Bernoulli, Binomial, Hypergeometric, and Poisson Random Variables Week 6, 9/24/12-9/28/12, Notes: Bernoulli, Binomial, Hypergeometric, and Poisson Random Variables 1 Monday 9/24/12 on Bernoulli and Binomial R.V.s We are now discussing discrete random variables that have

More information

Probability and Discrete Distributions

Probability and Discrete Distributions AMS 7L LAB #3 Fall, 2007 Objectives: Probability and Discrete Distributions 1. To explore relative frequency and the Law of Large Numbers 2. To practice the basic rules of probability 3. To work with the

More information

1 INFO Sep 05

1 INFO Sep 05 Events A 1,...A n are said to be mutually independent if for all subsets S {1,..., n}, p( i S A i ) = p(a i ). (For example, flip a coin N times, then the events {A i = i th flip is heads} are mutually

More information

Lecture 7: Confidence interval and Normal approximation

Lecture 7: Confidence interval and Normal approximation Lecture 7: Confidence interval and Normal approximation 26th of November 2015 Confidence interval 26th of November 2015 1 / 23 Random sample and uncertainty Example: we aim at estimating the average height

More information

Stat Lecture 20. Last class we introduced the covariance and correlation between two jointly distributed random variables.

Stat Lecture 20. Last class we introduced the covariance and correlation between two jointly distributed random variables. Stat 260 - Lecture 20 Recap of Last Class Last class we introduced the covariance and correlation between two jointly distributed random variables. Today: We will introduce the idea of a statistic and

More information

Bernoulli Trials and Binomial Distribution

Bernoulli Trials and Binomial Distribution Bernoulli Trials and Binomial Distribution Sec 4.4-4.5 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 10-3339 Cathy Poliak, Ph.D. cathy@math.uh.edu

More information

Mathematical Statistics 1 Math A 6330

Mathematical Statistics 1 Math A 6330 Mathematical Statistics 1 Math A 6330 Chapter 3 Common Families of Distributions Mohamed I. Riffi Department of Mathematics Islamic University of Gaza September 28, 2015 Outline 1 Subjects of Lecture 04

More information

Chapter 3: Discrete Random Variable

Chapter 3: Discrete Random Variable Chapter 3: Discrete Random Variable Shiwen Shen University of South Carolina 2017 Summer 1 / 63 Random Variable Definition: A random variable is a function from a sample space S into the real numbers.

More information

Discrete Random Variables (1) Solutions

Discrete Random Variables (1) Solutions STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 06 Néhémy Lim Discrete Random Variables ( Solutions Problem. The probability mass function p X of some discrete real-valued random variable X is given

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics March 14, 2018 CS 361: Probability & Statistics Inference The prior From Bayes rule, we know that we can express our function of interest as Likelihood Prior Posterior The right hand side contains the

More information

ST 371 (IX): Theories of Sampling Distributions

ST 371 (IX): Theories of Sampling Distributions ST 371 (IX): Theories of Sampling Distributions 1 Sample, Population, Parameter and Statistic The major use of inferential statistics is to use information from a sample to infer characteristics about

More information

Chapter 4a Probability Models

Chapter 4a Probability Models Chapter 4a Probability Models 4a.2 Probability models for a variable with a finite number of values 297 4a.1 Introduction Chapters 2 and 3 are concerned with data description (descriptive statistics) where

More information

Probability Year 9. Terminology

Probability Year 9. Terminology Probability Year 9 Terminology Probability measures the chance something happens. Formally, we say it measures how likely is the outcome of an event. We write P(result) as a shorthand. An event is some

More information

6.041SC Probabilistic Systems Analysis and Applied Probability, Fall 2013 Transcript Tutorial:A Random Number of Coin Flips

6.041SC Probabilistic Systems Analysis and Applied Probability, Fall 2013 Transcript Tutorial:A Random Number of Coin Flips 6.041SC Probabilistic Systems Analysis and Applied Probability, Fall 2013 Transcript Tutorial:A Random Number of Coin Flips Hey, everyone. Welcome back. Today, we're going to do another fun problem that

More information

STAT Chapter 13: Categorical Data. Recall we have studied binomial data, in which each trial falls into one of 2 categories (success/failure).

STAT Chapter 13: Categorical Data. Recall we have studied binomial data, in which each trial falls into one of 2 categories (success/failure). STAT 515 -- Chapter 13: Categorical Data Recall we have studied binomial data, in which each trial falls into one of 2 categories (success/failure). Many studies allow for more than 2 categories. Example

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya

Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

Test 3 SOLUTIONS. x P(x) xp(x)

Test 3 SOLUTIONS. x P(x) xp(x) 16 1. A couple of weeks ago in class, each of you took three quizzes where you randomly guessed the answers to each question. There were eight questions on each quiz, and four possible answers to each

More information

Probability Density Functions and the Normal Distribution. Quantitative Understanding in Biology, 1.2

Probability Density Functions and the Normal Distribution. Quantitative Understanding in Biology, 1.2 Probability Density Functions and the Normal Distribution Quantitative Understanding in Biology, 1.2 1. Discrete Probability Distributions 1.1. The Binomial Distribution Question: You ve decided to flip

More information