Statistics

Joe Erickson

Contents

1 Introduction to Statistics
  1.1 Basic Definitions
  1.2 Data and Measurement
  1.3 Experimental Design

2 Describing Data with Tables & Graphs
  2.1 Frequency Distributions and Their Tables
  2.2 Graphs of Frequency Distributions
  2.3 Stem-and-Leaf and Dot Plots
  2.4 Pie and Pareto Charts
  2.5 Graphing Paired Data Sets

3 Numerical Data Descriptors
  3.1 Measures of Center
  3.2 Measures of Dispersion
  3.3 Measures of Position

4 Probability Theory
  4.1 Fundamental Counting Principle
  4.2 Definition of Probability
  4.3 Conditional Probability
  4.4 The Multiplicative and Additive Laws
  4.5 Combinations and Permutations
  4.6 Combinatorial Probability

5 Discrete Probability Distributions
  5.1 Discrete Random Variables
  5.2 Expected Value: Discrete Case
  5.3 The Binomial Distribution
  5.4 The Geometric Distribution
  5.5 The Hypergeometric Distribution
  5.6 The Poisson Distribution

6 Continuous Probability Distributions
  6.1 Continuous Random Variables
  6.2 Expected Value: Continuous Case
  6.3 The Uniform Distribution
  6.4 The Normal Distribution
  6.5 The Gamma Distribution
  6.6 The Beta Distribution

1 Introduction to Statistics

1.1 Basic Definitions

The notion of a variable is taken to be understood by the reader who has had at least one course in basic algebra. The values assumed by a variable X, without regard to order but counting repetitions of the same value, are called data (singular: datum). Any collection of data is a data set, and if a data set consists of the values x_1, x_2, ..., x_n (counting repetitions of the same value), then the data set may be denoted by [[x_1, x_2, ..., x_n]]. The range of a variable X is the set of all possible distinct values that X can assume.

A very important point: a given data set associated with a variable X may feature some values in the range of X multiple times, while other values do not appear at all. A data set is not really a set in the usual mathematical sense, but rather a multiset. The difference between sets and multisets is simple: sets recognize neither repetition nor order among the objects they contain, whereas multisets do recognize repetitions of objects but are still blind to order. For example, {1, 2}, {1, 1, 1, 2, 2}, and {2, 1} are all considered to be the same set: namely, the set containing precisely the elements 1 and 2, and we write {1, 2} = {1, 1, 1, 2, 2} = {2, 1}. In contrast, [[1, 2]] and [[1, 1, 1, 2, 2]] are very different data sets: [[1, 2]] indicates a variable X assumed the value 1 once and the value 2 once (in no particular order), while [[1, 1, 1, 2, 2]] indicates that X assumed the value 1 three times and the value 2 twice (again in no particular order). We write [[1, 2]] ≠ [[1, 1, 1, 2, 2]]. On the other hand, [[1, 1, 1, 2, 2]] and [[2, 1, 2, 1, 1]] are considered the same data set, since order still does not matter to a multiset! We write [[1, 1, 1, 2, 2]] = [[2, 1, 2, 1, 1]].
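
As an aside not found in the original text, Python's collections.Counter gives a convenient working model of a multiset: repetitions are counted, order is ignored.

    from collections import Counter

    # A Counter records how many times each value occurs, so it behaves
    # like a data set (multiset): repetitions matter, order does not.
    a = Counter([1, 2])
    b = Counter([1, 1, 1, 2, 2])
    c = Counter([2, 1, 2, 1, 1])

    print(a == b)                               # False: [[1, 2]] differs from [[1, 1, 1, 2, 2]]
    print(b == c)                               # True:  [[1, 1, 1, 2, 2]] equals [[2, 1, 2, 1, 1]]
    print(set([1, 2]) == set([1, 1, 1, 2, 2]))  # True:  as ordinary sets they are equal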

Example 1.1. Let X represent the score that turns up when a six-sided die is rolled. Then the range of X equals the set {1, 2, 3, 4, 5, 6}. Now, by rolling the die at least once, X can be used to generate a data set. So for instance we may roll the die sixteen times and generate the following data set:

    D = [[2, 2, 1, 5, 6, 1, 4, 2, 6, 1, 6, 4, 2, 6, 1, 2]].

Note that the data set D counts repetitions of the same value. In this case we see that not all of the values in the range of X appear in D (3 is missing), while other values in the range of X appear multiple times in D. Again, data sets are not sensitive to order. If we rearrange the data set D in ascending order,

    [[1, 1, 1, 1, 2, 2, 2, 2, 2, 4, 4, 5, 6, 6, 6, 6]],

it is considered to be the same data set (i.e. equal to D).

Definition 1.2. Statistics is the study of the collection, organization, presentation, analysis, and interpretation of data.

There are two types of data sets associated with any variable, and the differences between these two types strike to the heart of what statistics is all about.

Definition 1.3. Let X be a variable. A population for X is a data set associated with X that contains precisely all of the data that is of interest. A sample for X is any data set associated with X that is contained within a population for X.

Example 1.4. Let X be a variable whose range equals the set {Stein, Trump, Johnson, Clinton, Other}. Thus the values of X can be taken to represent the response given when someone is asked who they would like to see elected U.S. President in 2016. What we wish to do is predict who will win the election before the election happens. This naturally requires the gathering of data. To gather the best data possible, we need to rigorously define the population for X. Clearly the population for X cannot be the survey responses of everyone in the world, for not everyone in the world is eligible to vote in U.S. elections. The population for X cannot even be the survey responses of all U.S. citizens of voting age, for not all such individuals bother to vote. Arguably the population for X should be taken to be the survey responses of all eligible U.S. voters who actually intend to vote on election day. Of course, there is a problem: it is not practical to take a survey of all of the more than 125,000,000 eligible U.S. voters who intend to vote on election day. Any polling firm, therefore, must instead gather a sample for X, usually comprised of a few hundred or a few thousand survey responses. The challenge is to get a sample that accurately reflects the population as a whole. It is a challenge that will likely keep statisticians comfortably employed until the end of history.

It bears repeating: the differences between a population and a sample strike to the heart of what statistics is all about.

In Example 1.4 we could say, somewhat less formally, that the population is the set of all eligible U.S. voters who intend to vote on election day, and any sample is some subset of the population. In this way the population for X in the example becomes a collection of human individuals rather than a data set comprised of values in the range of X. There is nothing wrong with taking this approach, since there is a direct correspondence between each human individual in the population and a value in the range of X.

There are two branches of statistics. Descriptive statistics is the branch concerned with the organization and presentation of data, while inferential statistics is concerned with using sample data to draw conclusions about a population. What is mentioned in the definition of statistics which is not mentioned in the definitions of descriptive and inferential statistics? The business of collecting data. Collecting good data is tremendously important, but is neither descriptive nor inferential.

Any numerical description of a population characteristic is called a parameter, while any numerical description of a sample characteristic is a statistic. Common sorts of parameters and statistics are means, medians, modes, standard deviations, percentages of a total, and the like.

1.2 Data and Measurement

1.3 Experimental Design

2 Describing Data with Tables & Graphs

2.1 Frequency Distributions and Their Tables

Example 2.1. The average daily number of sunspots observed from Earth on the sun's disk for the years 1976 to 2015 are as follows. (The table of yearly values is not reproduced here.) We shall make a frequency distribution for this data set using nine classes.

The minimum data entry is 4.2, and the maximum data entry is 220.1. The range of the data set is thus 220.1 - 4.2 = 215.9, and so we determine the class width to be

    Class width = Range / Number of classes = 215.9 / 9 ≈ 23.99,

which we round up to 24. The lower limit of the first class we shall set to be 4.2, the minimum data entry. We obtain the lower limits of the other classes by repeatedly adding the class width of 24 to the lower limit of the first class:

    4.2, 28.2, 52.2, 76.2, 100.2, 124.2, 148.2, 172.2, 196.2.

Since our data consists of decimals to the tenths place, the upper limit of the first class must be 0.1 less than the lower limit of the second class, which is to say 28.1. We obtain the upper limits of the other classes by repeatedly adding the class width of 24 to the upper limit of the first class:

    28.1, 52.1, 76.1, 100.1, 124.1, 148.1, 172.1, 196.1, 220.1.

Tally the number of data entries that fall into each class. A column of tally marks can be included in the frequency distribution table, though it is not required. The leftmost column of the frequency distribution table must list each of the nine classes, and the rightmost column must list the frequency of each class.

    Class            Tally    Frequency
    4.2 - 28.1
    28.2 - 52.1
    52.2 - 76.1
    76.2 - 100.1
    100.2 - 124.1
    124.2 - 148.1
    148.2 - 172.1
    172.2 - 196.1
    196.2 - 220.1

    (Tally marks and frequencies not reproduced in this transcription.)

Example 2.2. We extend the frequency distribution constructed in Example 2.1 to include class midpoints, relative class frequencies f_r, and cumulative frequencies f_c. We also do away with the tally column, which holds the same information as the existing column for frequency f. Relative frequencies are given to the thousandths place in this example, since it results in no rounding. In general, if relative frequencies are given to k decimal places, then all relative frequencies should be written to k decimal places by inserting zeros as needed. Thus, in this case where relative frequencies are given to 3 decimal places, we write .25 as .250, and .1 as .100. The columns of the extended table are Class, Midpoint, f, f_r, and f_c; the class midpoints, in order, are 16.15, 40.15, 64.15, 88.15, 112.15, 136.15, 160.15, 184.15, 208.15.
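
The class-construction procedure just described can be sketched in Python; the helper name frequency_distribution and the short data list below are illustrative assumptions, not values from the text.

    import math

    def frequency_distribution(data, num_classes, decimals=1):
        # Build class limits and frequencies following the steps of Example 2.1.
        lo, hi = min(data), max(data)
        width = math.ceil((hi - lo) / num_classes)   # round the class width up
        gap = 10 ** (-decimals)                      # gap between consecutive classes (0.1 here)
        classes = []
        for k in range(num_classes):
            lower = round(lo + k * width, decimals)
            upper = round(lower + width - gap, decimals)
            freq = sum(1 for x in data if lower <= x <= upper)
            classes.append((lower, upper, freq))
        return classes

    # A small made-up data set standing in for the forty sunspot averages.
    data = [4.2, 17.8, 30.1, 45.6, 80.2, 113.0, 145.7, 180.3, 210.5, 220.1]
    for lower, upper, freq in frequency_distribution(data, 9):
        print(f"{lower:6.1f} - {upper:6.1f}   {freq}")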

2.2 Graphs of Frequency Distributions

A frequency histogram is a representation of a frequency distribution as a vertical bar graph. Each bar represents a class of data, and the height of the bar indicates the frequency f of the class.

Example 2.3. We now make a frequency histogram of the data in Example 2.1. The numbers in the frequency column of the frequency distribution table are measured by the vertical scale, and the data entries are measured by the horizontal scale. In particular, as described above, the horizontal scale can be rendered in two ways: using class boundaries or class midpoints.

First we shall render the horizontal scale with class boundaries. Looking at the class column of the frequency distribution table in Example 2.1, where the class limits are given, we see that the gap between the first class and the second class is 28.2 - 28.1 = 0.1. Half of this value is 0.05, and so we subtract 0.05 from every lower class limit to obtain lower class boundaries, and we add 0.05 to every upper class limit to obtain upper class boundaries.

    Class Limits       Class Boundaries     Class Midpoint
    4.2 - 28.1         4.15 - 28.15         16.15
    28.2 - 52.1        28.15 - 52.15        40.15
    52.2 - 76.1        52.15 - 76.15        64.15
    76.2 - 100.1       76.15 - 100.15       88.15
    100.2 - 124.1      100.15 - 124.15      112.15
    124.2 - 148.1      124.15 - 148.15      136.15
    148.2 - 172.1      148.15 - 172.15      160.15
    172.2 - 196.1      172.15 - 196.15      184.15
    196.2 - 220.1      196.15 - 220.15      208.15

In this way we obtain the frequency histogram shown in Figure 2, with the zig-zag near the left end of the horizontal axis used to denote a break in the scale: the actual distance between 0 and 4.15 on the horizontal scale should be smaller than depicted.

Next we render the horizontal scale with class midpoints. We found the class midpoints already in Example 2.2, and they are included for reference in the table above. The resultant frequency histogram, shown in Figure 1, otherwise looks the same as before.

A relative frequency histogram is nothing more than a frequency histogram in which the vertical scale measures relative frequency f_r rather than frequency f.

Example 2.4. A relative frequency histogram of the data in Example 2.1 is shown in Figure 3. The horizontal scale still measures the data entries and can feature either class boundaries or class midpoints.

Figure 1. Frequency histogram with horizontal axis featuring class midpoints. (Horizontal axis: average daily number of sunspots; vertical axis: frequency, in number of years.)

Figure 2. Frequency histogram with horizontal axis featuring class boundaries. (Horizontal axis: average daily number of sunspots; vertical axis: frequency, in number of years.)

Figure 3. Relative frequency histogram. (Horizontal axis: average daily number of sunspots in a year; vertical axis: relative frequency, as a fraction of years.)

2.3 Stem-and-Leaf and Dot Plots

Example 2.5. The following data represent the daily high temperatures (in degrees Fahrenheit) during April 2003 for Willow Grove, listed here in ascending order:

    36, 37, 39, 42, 46, 47, 47, 47, 48, 49, 53, 56, 58, 61, 61,
    62, 62, 65, 65, 65, 66, 66, 68, 69, 72, 73, 75, 77, 79, 84.

A stem-and-leaf plot for this data set using six stems is as follows:

    Stem | Leaves
      3  | 6 7 9
      4  | 2 6 7 7 7 8 9
      5  | 3 6 8
      6  | 1 1 2 2 5 5 5 6 6 8 9
      7  | 2 3 5 7 9
      8  | 4

    Key: 3 | 6 = 36

2.4 Pie and Pareto Charts

2.5 Graphing Paired Data Sets
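
A stem-and-leaf plot like the one above can be produced with a few lines of Python (a sketch, not part of the original text):

    from collections import defaultdict

    temps = [36, 37, 39, 42, 46, 47, 47, 47, 48, 49, 53, 56, 58, 61, 61,
             62, 62, 65, 65, 65, 66, 66, 68, 69, 72, 73, 75, 77, 79, 84]

    # The stem is the tens digit and the leaf is the ones digit.
    plot = defaultdict(list)
    for t in sorted(temps):
        plot[t // 10].append(t % 10)

    for stem in sorted(plot):
        print(stem, "|", " ".join(str(leaf) for leaf in plot[stem]))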

3 Numerical Data Descriptors

3.1 Measures of Center

A measure of center for a data set D (otherwise known as an average of the data in D) gives a value that is, in some sense, the center of D. The most common measures of center are the arithmetic mean, the median, and the mode. We will also consider measures of center known as the weighted arithmetic mean and the midrange.

Definition 3.1. The arithmetic mean of a population of size N is

    µ = (x_1 + x_2 + ··· + x_N) / N = (1/N) Σ_{k=1}^{N} x_k = Σx / N,

where x_1, x_2, ..., x_N are the values in the population. The arithmetic mean of a sample of size n is

    x̄ = (x_1 + x_2 + ··· + x_n) / n = (1/n) Σ_{k=1}^{n} x_k = Σx / n,

where x_1, x_2, ..., x_n are the values in the sample.

Henceforth we shall refer to the arithmetic mean as simply the mean. There exist other kinds of means, such as the geometric mean, harmonic mean, and so on. Of course, the two formulas in Definition 3.1 convey the same idea, with the only differences being in terminology and notation. If a data set is a population, then the mean is denoted by the symbol µ (the Greek letter mu, pronounced "mew"); and if a data set is a sample (i.e. a portion of a population), then the mean is denoted by x̄ ("x-bar"). In Definition 3.1 the defining expressions for µ and x̄ are each written in three ways, in increasing order of compactness. The symbol Σx in the rightmost (most compact) expressions always denotes the sum of all the values in a data set, no matter whether it is a population or a sample.

Before defining the median of a data set, we first introduce some new notation. Given a data set D = [[x_1, ..., x_n]], we often need to rearrange the data in ascending order. Symbolically we may do this by relabeling the values in D using the symbols x'_1, ..., x'_n.

Here x'_1 denotes the smallest of the values in D, while x'_2 denotes the second smallest value, x'_3 the third smallest value, and so on until we reach the largest value x'_n. Thus we have x'_1 ≤ x'_2 ≤ x'_3 ≤ ··· ≤ x'_n. Since data sets are not sensitive to order, [[x'_1, ..., x'_n]] is considered to be the same data set as D = [[x_1, ..., x_n]].

Example 3.2. If D = [[x_1, x_2, x_3, x_4, x_5, x_6]] is a data set with

    x_1 = 6, x_2 = -1, x_3 = 8, x_4 = 0, x_5 = 6, x_6 = 3,

then:

    x'_1 = -1, x'_2 = 0, x'_3 = 3, x'_4 = 6, x'_5 = 6, x'_6 = 8.

Definition 3.3. The median of the data set [[x_1, ..., x_n]] is defined as follows:

    Median = x'_{n/2 + 1/2}                    if n is odd,
    Median = (x'_{n/2} + x'_{n/2 + 1}) / 2     if n is even.

Example 3.4. Find the median of the data set [[6, 4, 9, -7, 3, 5, 4, -2, 4, 9, 9, 6, 7, -3, 3]].

Solution. The size of the data set is n = 15, which of course is an odd number. Calculating n/2 + 1/2 = 15/2 + 1/2 = 8, by Definition 3.3 the median of the data set equals x'_8. Writing out the data set in ascending order gives

    [[-7, -3, -2, 3, 3, 4, 4, 4, 5, 6, 6, 7, 9, 9, 9]],

which makes clear that

    x'_1 = -7, x'_2 = -3, x'_3 = -2, x'_4 = x'_5 = 3, x'_6 = x'_7 = x'_8 = 4,
    x'_9 = 5, x'_10 = x'_11 = 6, x'_12 = 7, x'_13 = x'_14 = x'_15 = 9,

and hence the median is 4.

Definition 3.5. Let {t_1, ..., t_m} denote the set of distinct values contained in a data set D. For each 1 ≤ i ≤ m let f_i denote the frequency with which the value t_i occurs in D, and let f equal the largest of the frequencies f_1, ..., f_m. The modes of D are the elements of {t_1, ..., t_m} having frequency equal to f.

Example 3.6. Find the modes of the data set [[6, 4, 9, -7, 3, 5, 4, -2, 4, 9, 9, 6, 7, -3, 3]].

Solution. The set of distinct values in the data set is {-7, -3, -2, 3, 4, 5, 6, 7, 9}. The values -7, -3, -2, 5, and 7 each appear in the data set precisely once, and so each have a frequency of 1. The values 3 and 6 each appear twice, so each have frequency 2. Finally, 4 and 9 each have frequency 3. The largest frequency is thus f = 3, and since the values 4 and 9 each have frequency equal to f = 3, both 4 and 9 are modes of the data set.
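
The results of Examples 3.4 and 3.6 can be checked with Python's standard statistics module (a sketch, not part of the original text):

    import statistics

    data = [6, 4, 9, -7, 3, 5, 4, -2, 4, 9, 9, 6, 7, -3, 3]

    print(statistics.mean(data))       # 3.8, the mean of the data set
    print(statistics.median(data))     # 4, agreeing with Example 3.4
    print(statistics.multimode(data))  # [4, 9], the two modes found in Example 3.6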

In general, a data set having one mode is called unimodal, and a data set with more than one mode is multimodal. In particular, a data set with two modes is bimodal, which is the case in Example 3.6.

Definition 3.7. For each value x in a data set let w ∈ R be an associated weight. Then the weighted mean of the data set is

    x̄ = Σ(wx) / Σw.

Example 3.8. A grade point average (GPA) is a weighted mean of grade points received for each credit of coursework completed. An academic course credit given a grade of A, B, C, D, or F is awarded 4, 3, 2, 1, or 0 grade points, respectively (at least at institutions that append no + or - to any letter grade). The weight associated with each of the five possible grade point values is simply the number of course credits that are awarded that value. As an example, suppose Wolfgang Jägermann got an A in 1 five-credit course, a C in 2 three-credit courses, a B in 1 three-credit course, and a D in 1 two-credit course. Then his GPA is determined as follows, where we let cr stand for "credits" and gp stand for "grade points":

    GPA = [(5 cr)(4 gp) + (6 cr)(2 gp) + (3 cr)(3 gp) + (2 cr)(1 gp)] / (5 cr + 6 cr + 3 cr + 2 cr) = 43/16 gp ≈ 2.69 gp.

Thus Herr Jägermann received a mean of approximately 2.69 grade points per credit.

3.2 Measures of Dispersion
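
A minimal sketch of the weighted-mean computation in Example 3.8 (the list courses is my own encoding of Wolfgang's transcript, not part of the text):

    # (credits, grade points) for each course: one A, two Cs, one B, one D
    courses = [(5, 4), (3, 2), (3, 2), (3, 3), (2, 1)]

    weighted_sum = sum(credits * points for credits, points in courses)  # Σ(wx) = 43
    total_credits = sum(credits for credits, _ in courses)               # Σw = 16

    print(round(weighted_sum / total_credits, 2))  # 2.69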

3.3 Measures of Position

Example 3.9. The following data represent the daily high temperatures (in degrees Fahrenheit) during April 2003 for Willow Grove, listed in ascending order:

    36, 37, 39, 42, 46, 47, 47, 47, 48, 49, 53, 56, 58, 61, 61,
    62, 62, 65, 65, 65, 66, 66, 68, 69, 72, 73, 75, 77, 79, 84.

(a) Find the quartiles Q_1, Q_2, and Q_3. (b) Construct a box-and-whisker plot.

Solution. (a) The median is (61 + 62)/2 = 61.5, and so Q_2 = 61.5. Now we divide the data set into two halves: its smallest 15 values and its largest 15 values. The smallest 15 values in ascending order are

    36, 37, 39, 42, 46, 47, 47, 47, 48, 49, 53, 56, 58, 61, 61.

The median of the smallest 15 values is Q_1 = 47. The largest 15 values in ascending order are

    62, 62, 65, 65, 65, 66, 66, 68, 69, 72, 73, 75, 77, 79, 84.

The median of the largest 15 values is Q_3 = 68. Thus:

    Q_1 = 47, Q_2 = 61.5, Q_3 = 68.

(b) The box-and-whisker plot: (figure not reproduced in this transcription)
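
The median-of-halves quartile rule used above can be written out directly (a sketch; note that some libraries use a slightly different quartile convention):

    temps = [36, 37, 39, 42, 46, 47, 47, 47, 48, 49, 53, 56, 58, 61, 61,
             62, 62, 65, 65, 65, 66, 66, 68, 69, 72, 73, 75, 77, 79, 84]

    def median(values):
        # Median of an already-sorted list, per Definition 3.3.
        n = len(values)
        mid = n // 2
        return values[mid] if n % 2 == 1 else (values[mid - 1] + values[mid]) / 2

    data = sorted(temps)
    half = len(data) // 2
    print(median(data[:half]), median(data), median(data[-half:]))  # 47 61.5 68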

4 Probability Theory

4.1 Fundamental Counting Principle

Definition 4.1. An experiment E is a predetermined procedure by which an observation is made. The procedure consists of one or more performances of a single unvarying action called a trial.

When an experiment E is done, we speak of performing E. The individual trials constituting an experiment E are experiments in their own right, so that E may be viewed as a succession of simpler experiments. The precise number of trials constituting an experiment is not necessarily known in advance, since the experiment's predetermined procedure may call for repeating an action until some desired result is obtained.

Example 4.2. A single roll of a six-sided die is an experiment, and it consists of just one trial: namely, a single roll of a six-sided die. There is no meaningful way to decompose the procedure of rolling one die one time into a sequence of simpler actions. Ten successive rolls of a die is also an experiment, only this time the experiment can be said to consist of ten trials: each roll is one trial.

What is observed to occur when an experiment is conducted we tentatively refer to as a result. There are usually many ways to describe the result of an experiment. If an experiment consists of a single roll of a six-sided die and the result is a 5, we could say the outcome was five, or an odd number, or a number greater than 2, among other possibilities. Only the first characterization of the experimental result, that of getting a five, makes clear which side of the die was actually observed facing upward when the experiment was finished. To say the result was an odd number leaves open the possibility of a 1, 3, or 5 having actually turned up, with no additional information at hand to narrow the choices down.

For the next definition recall that a set A is a subset of a set B, written A ⊆ B, if x ∈ A implies that x ∈ B. Put another way, A is a subset of B if every element of A is also an element of B. This means, in particular, that any set is a subset of itself: A ⊆ A.

Definition 4.3. An outcome of an experiment is a description of an experimental result that loses no information gained by the original observation. The sample space Ω of an experiment is the set of all the experiment's possible outcomes, and an event is any subset of the sample space.

In the literature the elements of an experiment's sample space are often referred to as sample points rather than outcomes. A simple event is an event that contains precisely one outcome of an experiment.

Let Ω = {ω_1, ω_2, ..., ω_n} be the sample space of an experiment. Among the n elements of Ω, choose any m ≥ 1 of them: ω_{k_1}, ω_{k_2}, ..., ω_{k_m}. (Here k_1, k_2, ..., k_m are numbers drawn from the set {1, 2, ..., n}, where of course m ≤ n.) Let E be the set containing these elements: E = {ω_{k_1}, ω_{k_2}, ..., ω_{k_m}}. Then E is a subset of Ω, we write E ⊆ Ω, and we call E an event in accord with Definition 4.3. Specifically, E is the event in which one of the outcomes ω_{k_1}, ω_{k_2}, ..., ω_{k_m} is observed to occur upon carrying out the experiment a single time.

Note that since Ω is a subset of itself (i.e. Ω ⊆ Ω), the set Ω is itself an event: namely, the event that one of the possible outcomes of the experiment occurs upon carrying out the experiment a single time. This is always certain to happen, and so Ω is called the certain event. The empty set ∅, which is the set that contains no elements, is considered to be a subset of any set, so it is certainly a subset of any experiment's sample space Ω and therefore constitutes an event. It is specifically the event that none of the outcomes contained in Ω are observed to occur upon carrying out the experiment a single time. Of course, this implies that an outcome occurred that is not among the outcomes contained in Ω, and since by definition Ω contains all possible outcomes, we have arrived at a contradiction. For this reason ∅ is called the impossible event.

Two events E_1, E_2 ⊆ Ω are mutually exclusive if E_1 ∩ E_2 = ∅. This means that the simultaneous occurrence of events E_1 and E_2 is impossible.

Example 4.4. Suppose an experiment consists of a single toss of a coin, so as to see which side of the coin faces up when it comes to rest. There are two possible outcomes: heads or tails. If we let H stand for heads and T stand for tails, then we can see that the sample space of the experiment is Ω = {H, T}. There are only four possible events in this scenario, corresponding to the four possible subsets of Ω:

    ∅, {H}, {T}, {H, T}.

The set ∅ corresponds to the event of a coin toss resulting in neither heads nor tails, which cannot occur. The set {H, T}, which equals Ω itself, is the event of a coin toss resulting in either heads or tails, which is sure to occur. The set {H} is the event of observing heads but not tails, while {T} is the event of observing tails but not heads.

Example 4.5. Suppose an experiment consists of rolling a single eight-sided die having sides numbered from 1 to 8. The sample space of the experiment is Ω = {1, 2, 3, 4, 5, 6, 7, 8}. The number of possible events that can occur equals the number of subsets that Ω has, which happens to be 256: too many to easily list. The event {2, 4, 6, 8} could be described as the event of rolling an even number, while the event {2, 3, 5, 7} could be described as the event of rolling a prime number. How might we describe the event {6, 7, 8} or {1, 4}?
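
The four events of Example 4.4, and the count of 256 events in Example 4.5, can be generated directly (a sketch, not part of the original text):

    from itertools import chain, combinations

    def events(omega):
        # All subsets of the sample space omega, i.e. its power set.
        return list(chain.from_iterable(combinations(omega, r)
                                        for r in range(len(omega) + 1)))

    print(events(["H", "T"]))        # [(), ('H',), ('T',), ('H', 'T')]
    print(len(events(range(1, 9))))  # 256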

Another set-theoretic notion is needed for what's coming. The Cartesian product of two nonempty sets A and B is the set of ordered pairs

    A × B = { (a, b) : a ∈ A and b ∈ B }.

Thus if A = {$, %, #} and B = {α, β}, then

    A × B = { ($, α), ($, β), (%, α), (%, β), (#, α), (#, β) }

and

    B × A = { (α, $), (β, $), (α, %), (β, %), (α, #), (β, #) }.

Note that A × B ≠ B × A since, for instance, ($, α) ≠ (α, $). More generally, given nonempty sets A_1, A_2, ..., A_n, we have

    A_1 × A_2 × ··· × A_n = { (a_1, a_2, ..., a_n) : a_k ∈ A_k for each 1 ≤ k ≤ n }.

Definition 4.6. Suppose experiments E_1 and E_2 have sample spaces Ω_1 and Ω_2, respectively. The experiment consisting of performing E_1 and E_2 in sequence, denoted by E_1 × E_2, is the experiment having sample space Ω_1 × Ω_2. More generally, if E_k has sample space Ω_k for each 1 ≤ k ≤ n, then the experiment E_1 × ··· × E_n consisting of performing E_1, ..., E_n in sequence is the experiment having sample space Ω_1 × ··· × Ω_n.

Example 4.7. Let E_1 be the experiment consisting of a single toss of a coin, as in Example 4.4, and let E_2 be the experiment consisting of a single roll of an eight-sided die, as in Example 4.5. The experiment E consisting of performing E_1 and E_2 in sequence is the experiment consisting of first tossing the coin and then rolling the die. The sample space of E_1 is Ω_1 = {H, T}, while the sample space of E_2 is Ω_2 = {1, 2, 3, 4, 5, 6, 7, 8}. Thus, by definition, the sample space of E is

    Ω_1 × Ω_2 = { (H, 1), (H, 2), (H, 3), (H, 4), (H, 5), (H, 6), (H, 7), (H, 8),
                  (T, 1), (T, 2), (T, 3), (T, 4), (T, 5), (T, 6), (T, 7), (T, 8) }.

Each ordered pair in Ω_1 × Ω_2 is an outcome of the experiment E. For instance, (T, 7) is the outcome of tossing tails and then rolling a 7.

Theorem 4.8 (Fundamental Counting Principle). If experiment E_k has n_k outcomes for each 1 ≤ k ≤ m, then the experiment E_1 × E_2 × ··· × E_m has n_1 n_2 ··· n_m possible outcomes.

The veracity of the theorem is fairly self-evident; however, the rigorous-minded can craft a proof using the Principle of Induction.

Example 4.9. If E_1 is the experiment consisting of a single toss of a coin, and E_2 is the experiment consisting of a single roll of an eight-sided die, then E_1 has n_1 = 2 outcomes (see Example 4.4), and E_2 has n_2 = 8 outcomes (see Example 4.5). By the Fundamental Counting Principle it follows that the experiment E consisting of performing E_1 and E_2 in sequence has n_1 n_2 = 16 outcomes. These outcomes are listed in Example 4.7.
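
The sample space of Example 4.7 can be generated with itertools.product, which computes exactly the Cartesian product Ω_1 × Ω_2 (a sketch, not part of the original text):

    from itertools import product

    coin = ["H", "T"]                        # sample space of E_1
    die = range(1, 9)                        # sample space of E_2 (eight-sided die)

    sample_space = list(product(coin, die))  # all (coin, die) ordered pairs
    print(len(sample_space))                 # 16, as the Fundamental Counting Principle predicts
    print(("T", 7) in sample_space)          # True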

Example 4.10. An account password is known to consist of six characters with the following parameters: the first three characters are letters with possible repetition, the fourth and fifth characters are numerals without repetition, and the last character is a symbol from the set {#, $, %, !, ?}. The letters are case-sensitive, so that for instance x is distinguished from X. How many possible passwords satisfy the parameters?

Define the following four experiments:

    E_1: Choose an uppercase or lowercase letter.
    E_2: Choose a numeral.
    E_3: Choose a numeral not chosen in E_2.
    E_4: Choose a symbol from the set {#, $, %, !, ?}.

A possible password is formed by performing the experiment E = E_1 × E_1 × E_1 × E_2 × E_3 × E_4. Note that E_1 is performed in sequence three consecutive times, and it is important that E_2 be performed before E_3! The number of outcomes possible for E_1 is n_1 = 52: choose any one of 26 lowercase or 26 uppercase letters. The number of outcomes possible for E_2 is n_2 = 10: choose one of the numerals from 0 to 9. For E_3 we must choose a numeral not already chosen in the course of performing E_2, and so there are n_3 = 9 outcomes. Finally, E_4 has n_4 = 5 possible outcomes. By the Fundamental Counting Principle we conclude that the number of possible outcomes for E is

    n_1 · n_1 · n_1 · n_2 · n_3 · n_4 = 52 · 52 · 52 · 10 · 9 · 5 = 63,273,600.

Therefore there are 63,273,600 possible passwords.
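
The count in Example 4.10 is a one-line product (a sketch, with variable names of my own choosing):

    letters = 52        # 26 lowercase + 26 uppercase, repetition allowed
    first_digit = 10    # numerals 0 through 9
    second_digit = 9    # any numeral except the one already chosen
    symbols = 5         # {#, $, %, !, ?}

    print(letters ** 3 * first_digit * second_digit * symbols)  # 63273600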

4.2 Definition of Probability

A set is finite if it has a finite number of elements. Thus A = {1, 2, 3} is finite (3 elements), whereas the set of positive integers N = {1, 2, 3, ...} has an infinite number of elements and hence is not finite.

Definition 4.11. A one-to-one correspondence between the elements of one set A and another set B is a function f : A → B with the following properties:

1. Injectivity: Whenever a_1, a_2 ∈ A are such that a_1 ≠ a_2, then f(a_1) ≠ f(a_2).
2. Surjectivity: For every b ∈ B there exists some a ∈ A such that f(a) = b.

Remark. In the definition above, the equality f(a) = b can be interpreted as a pairing of a ∈ A with b ∈ B, and this pairing may be expressed quite naturally as the ordered pair (a, b). So, a one-to-one correspondence between sets A and B may be viewed as a relation that pairs each element of A with a unique element of B, such that no element of B is left unpaired with an element of A.

A function f : A → B possessing the injectivity (resp. surjectivity) property in Definition 4.11 is injective (resp. surjective). Thus a function is a one-to-one correspondence if and only if it is both injective and surjective.

Example 4.12. Let A = {1, 2, 3} and B = {u, v, w}. If we define the function f : A → B by

    f(1) = v, f(2) = w, f(3) = u,

then f is a one-to-one correspondence between A and B. This correspondence may be alternately expressed as the set of ordered pairs {(1, v), (2, w), (3, u)}. Now define a new function g : A → B by

    g(1) = v, g(2) = v, g(3) = w.

The function g is not a one-to-one correspondence between A and B, since for u ∈ B there is no a ∈ A such that g(a) = u, thereby violating the surjectivity property in Definition 4.11.

Example 4.13. Let A = {1, 2, 3, 4} and B = {u, v, w}. The function h : A → B given by

    h(1) = v, h(2) = w, h(3) = w, h(4) = u,

is not a one-to-one correspondence between A and B. We have 2, 3 ∈ A with 2 ≠ 3, and yet h(2) = h(3) = w, which violates the injectivity property in Definition 4.11.

Definition 4.14. A set A is countably infinite if there exists a one-to-one correspondence between the elements of A and the elements of N = {1, 2, 3, ...}. A set that is finite or countably infinite is discrete. A set that is not discrete is uncountable. An uncountable set that equals an interval of real numbers is continuous.

Implicit in Definition 4.14 is that any interval of real numbers such as (0, 1), [-1, 1], or (2, ∞) is an uncountable set. This is true, but no proof of the fact shall be given here.
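
The finite functions of Example 4.12 can be checked mechanically by treating each function as a Python dict (a sketch, not part of the original text); note that g in fact fails both properties, though failing either one alone is enough.

    def injective(h):
        # No two inputs share an output.
        return len(set(h.values())) == len(h)

    def surjective(h, codomain):
        # Every element of the codomain is hit.
        return set(h.values()) == codomain

    B = {"u", "v", "w"}
    f = {1: "v", 2: "w", 3: "u"}   # the one-to-one correspondence of Example 4.12
    g = {1: "v", 2: "v", 3: "w"}   # u is never hit, so g is not surjective

    print(injective(f), surjective(f, B))  # True True
    print(injective(g), surjective(g, B))  # False False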

Example 4.15. Show that the set of positive even integers A = {2, 4, 6, ...} is countably infinite.

Solution. Define a function f : A → N by f(a) = a/2 for each a ∈ A. Note that if a ∈ A, then a is a positive integer divisible by 2, so that a/2 is a positive integer and hence an element of N as required. Suppose a_1, a_2 ∈ A are such that a_1 ≠ a_2. Then a_1/2 ≠ a_2/2, and hence f(a_1) ≠ f(a_2). This shows that f is an injective function. Next, let n ∈ N be arbitrary. Then 2n is an even integer, which is to say 2n ∈ A and hence 2n is in the domain of f. Now, f(2n) = 2n/2 = n, and since n ∈ N is arbitrary, we see that for any n ∈ N there exists some a ∈ A such that f(a) = n. This shows that f is a surjective function. Since f : A → N is both injective and surjective, it is a one-to-one correspondence. Therefore A is countably infinite.

A discrete sample space is a sample space that is a discrete set, and a continuous sample space is a sample space that is a continuous set.

Example 4.16. Consider the experiment E_1 consisting of tossing a coin repeatedly until heads is obtained. The first toss could result in heads (outcome: H), or the first toss tails and the second toss heads (outcome: TH), or the first two tosses tails and the third toss heads (outcome: TTH), and so on. The sample space for E_1 is

    Ω_1 = {H, TH, TTH, TTTH, ...}.

This is a countably infinite set, for we can construct a one-to-one correspondence with N by pairing 1 with H, 2 with TH, 3 with TTH, and in general n ∈ N with the outcome ω ∈ Ω consisting of n tosses of the coin (n - 1 tails and 1 heads). Therefore Ω_1 is a discrete sample space.

Now consider the experiment E_2 consisting of recording the exact time it takes, in seconds, until heads is obtained when repeatedly tossing a coin. Of course any measuring device such as a stopwatch will give an approximate time (the nearest millisecond perhaps), but in fact the actual time it takes to get heads the first time could be any positive real number. The sample space for E_2 is thus Ω_2 = (0, ∞), and since any interval of real numbers is a continuous set, Ω_2 is a continuous sample space.

The collection of all subsets of a set A is denoted by P(A) and called the power set of A. In particular if Ω is the sample space of an experiment, then its power set P(Ω) is the collection of all possible events that could occur when the experiment is performed. Clearly E ∈ P(Ω) if and only if E ⊆ Ω.

A sequence of events is a collection {E_1, E_2, E_3, ...} ⊆ P(Ω) that is ordered in accordance with the subscripts: event E_1 before E_2, E_2 before E_3, and so on. Such a sequence we denote by (E_n)_{n=1}^∞. Similarly, (E_n)_{n=1}^N for N ∈ N denotes a finite sequence of events, with E_1 before E_2, E_2 before E_3, and so on until E_N, which is the last event. A sequence (E_n)_{n=1}^∞ or finite sequence (E_n)_{n=1}^N is mutually exclusive if E_i ∩ E_j = ∅ whenever i ≠ j.

In what follows it will be convenient to introduce some new notation. Given any sequence of sets (A_n)_{n=1}^∞ and N ∈ N, we define

    ⋃_{n=1}^{N} A_n = A_1 ∪ A_2 ∪ ··· ∪ A_N = {x : x ∈ A_n for some 1 ≤ n ≤ N},

and

    ⋃_{n=1}^{∞} A_n = {x : x ∈ A_n for some n ≥ 1}.

If the sets being united are mutually disjoint, then ⊔ will be used instead of ∪. Also, given sets A and B, we define the set difference of A and B to be the set

    A ∖ B = {x : x ∈ A and x ∉ B}.

In particular, given a sample space Ω and E ⊆ Ω, we define the event E^c = Ω ∖ E to be the complement of the event E (relative to Ω).

Definition 4.17. Let S be a set. A σ-algebra on S is a collection Σ ⊆ P(S) with the following properties.

1. S ∈ Σ.
2. If A ∈ Σ, then S ∖ A ∈ Σ.
3. If A_n ∈ Σ for n ∈ N, then ⋃_{n=1}^{∞} A_n ∈ Σ.

Since S ∖ S = ∅, the first two properties above immediately imply that ∅ is a member of any σ-algebra. Also, for any set S it can be shown that P(S) itself is a σ-algebra on S. Another σ-algebra on S (in fact the smallest one possible) is the collection {∅, S}.

Example 4.18. Let S = {1, 2, 3}. What is the smallest σ-algebra on S that contains {1}? Let Σ be this σ-algebra. Then {1} ∈ Σ by requirement, and ∅, S ∈ Σ by the first two properties in Definition 4.17. Moreover, {1} ∈ Σ implies that S ∖ {1} = {2, 3} is in Σ also. It is straightforward to check that the collection

    {∅, {1}, {2, 3}, S}

is a σ-algebra on S that contains {1}, and since none of the members ∅, {2, 3}, or S can be removed from the collection without violating one of the three properties in Definition 4.17, it follows that Σ = {∅, {1}, {2, 3}, S} is the smallest σ-algebra on S that contains {1}.

Given any σ-algebra Σ on a sample space Ω, we say a sequence of events (E_n)_{n=1}^∞ is a sequence in Σ if E_n ∈ Σ for all n ∈ N.

Definition 4.19. Let Ω be a sample space, and let Σ ⊆ P(Ω) be a σ-algebra on Ω. A function P : Σ → R is a probability measure on Σ if it satisfies the following axioms:

A1. P(E) ≥ 0 for every E ∈ Σ.
A2. P(Ω) = 1.
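
A brute-force check (not from the text) that the collection found in Example 4.18 really is a σ-algebra on S; for a finite collection, closure under countable unions reduces to closure under pairwise unions.

    from itertools import combinations

    S = frozenset({1, 2, 3})
    sigma = {frozenset(), frozenset({1}), frozenset({2, 3}), S}

    print(S in sigma)                                              # property 1
    print(all(S - A in sigma for A in sigma))                      # property 2: complements
    print(all(A | B in sigma for A, B in combinations(sigma, 2)))  # property 3 (finite form)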

A3. If (E_n)_{n=1}^∞ is a sequence of mutually exclusive events in Σ, then

    P( ⨆_{n=1}^{∞} E_n ) = Σ_{n=1}^{∞} P(E_n).

If E ∈ Σ, then E is a measurable event, and the real number P(E) is the probability of the event E.

It may be wondered why the notion of a σ-algebra is involved in the definition of a probability measure. If Σ in the definition is not equal to P(Ω), then that means there are sets E ⊆ Ω for which P(E) is undefined. That is, there exist events in the sample space of an experiment that have no probability! How can this be? In fact, if Ω in Definition 4.19 is specifically a discrete sample space, then usually Σ is taken to be P(Ω) as expected. Difficulties arise, however, in cases when Ω is an uncountable sample space, including the common scenario in which Ω is an interval of real numbers. It can be shown, for instance, that the only probability measure P that can be defined on P(R) is given by P(R) = 1 and P(E) = 0 for all E ⊆ R such that E ≠ R. This is not a very useful probability measure! Thus, if we wish to develop a theory of probability that has any utility in a world full of continuous sample spaces, we must relinquish the hope of assigning probability values to every conceivable event. The area of mathematics most concerned with developing such a theory of probability is known as measure theory, and it is beyond the scope of this book.

Proposition 4.20. Let Ω be a sample space and P a probability measure on a σ-algebra Σ on Ω. Then the following hold.

1. P(∅) = 0.
2. If (E_n)_{n=1}^N is a finite sequence of mutually exclusive events in Σ, then

       P( ⨆_{n=1}^{N} E_n ) = Σ_{n=1}^{N} P(E_n).

3. If E_1, E_2 ∈ Σ are such that E_1 ⊆ E_2, then P(E_1) ≤ P(E_2).
4. P(E) ≤ 1 for every E ∈ Σ.
5. If E_1, E_2 ∈ Σ, then

       P(E_1 ∖ E_2) = P(E_1) - P(E_1 ∩ E_2).   (4.1)

   Thus if E_2 ⊆ E_1 also holds, then

       P(E_1 ∖ E_2) = P(E_1) - P(E_2).   (4.2)

6. If E ∈ Σ, then P(E^c) = 1 - P(E).

Proof. Proof of (1). Define the sequence (E_n)_{n=1}^∞ with E_1 = Ω and E_n = ∅ for n ≥ 2. This sequence is mutually exclusive with Ω = ⨆_{n=1}^∞ E_n. Now by Axiom A3,

    P(Ω) = P( ⨆_{n=1}^∞ E_n ) = Σ_{n=1}^∞ P(E_n) = P(Ω) + Σ_{n=2}^∞ P(∅),

and thus Σ_{n=2}^∞ P(∅) = 0.

Since P(∅) ≥ 0 by Axiom A1, it follows that P(∅) = 0.

Proof of (2). Let (E_n)_{n=1}^N be a finite sequence of mutually exclusive events in Σ. Extend this sequence to an infinite sequence (E_n)_{n=1}^∞ of mutually exclusive events by defining E_n = ∅ for n ≥ N + 1. Then

    ⨆_{n=1}^{N} E_n = ⨆_{n=1}^{∞} E_n,

and P(E_n) = P(∅) = 0 for n ≥ N + 1 by part (1). By Axiom A3,

    P( ⨆_{n=1}^{N} E_n ) = P( ⨆_{n=1}^{∞} E_n ) = Σ_{n=1}^{∞} P(E_n) = Σ_{n=1}^{N} P(E_n) + Σ_{n=N+1}^{∞} P(E_n) = Σ_{n=1}^{N} P(E_n).

Proof of (3). Suppose E_1, E_2 ∈ Σ are such that E_1 ⊆ E_2. Then E_2 = E_1 ⊔ (E_2 ∖ E_1), and since E_1 ∩ (E_2 ∖ E_1) = ∅, part (2) implies that

    P(E_2) = P( E_1 ⊔ (E_2 ∖ E_1) ) = P(E_1) + P(E_2 ∖ E_1).

Now, P(E_2 ∖ E_1) ≥ 0 by Axiom A1, so that

    P(E_1) = P(E_2) - P(E_2 ∖ E_1) ≤ P(E_2).

Proof of (4). Suppose E ∈ Σ. Then P(E) ≤ P(Ω) = 1 by part (3) and Axiom A2.

Proof of (5). Suppose E_1, E_2 ∈ Σ. Then

    P(E_1) = P( E_1 ∩ (E_2 ∪ E_2^c) ) = P( (E_1 ∩ E_2) ∪ (E_1 ∩ E_2^c) ).

Since E_1 ∩ E_2 and E_1 ∩ E_2^c are mutually exclusive events, part (2) implies that

    P(E_1) = P(E_1 ∩ E_2) + P(E_1 ∩ E_2^c).

It is straightforward to check that E_1 ∩ E_2^c = E_1 ∖ E_2, and so (4.1) follows. We obtain (4.2) from (4.1) by noting that E_1 ∩ E_2 = E_2 if E_2 ⊆ E_1.

Proof of (6). We have E and E^c = Ω ∖ E as mutually exclusive events in Σ such that E ⊔ E^c = Ω, and so Axiom A2 and part (2) imply that

    1 = P(Ω) = P(E ⊔ E^c) = P(E) + P(E^c).

The conclusion that P(E^c) = 1 - P(E) readily follows.

Theorem 4.21. Let Ω be a discrete sample space, and suppose P is a probability measure on P(Ω). If E ⊆ Ω with E ≠ ∅, then

    P(E) = Σ_{ω∈E} P({ω}).

Proof. Suppose E ⊆ Ω, with E not equalling the empty set. Then E is discrete since Ω is discrete, and so either E = {ω_1, ..., ω_N} for some N ∈ N, or E = {ω_1, ω_2, ...}. Suppose E = {ω_1, ..., ω_N}. For each 1 ≤ n ≤ N define E_n = {ω_n}, so E = E_1 ⊔ ··· ⊔ E_N. Since ω_i ≠ ω_j whenever i ≠ j, we find that E_i ∩ E_j = ∅ whenever i ≠ j, and hence (E_n)_{n=1}^N is a finite sequence of mutually exclusive events in Ω. Now, by Proposition 4.20(2),

    P(E) = P( ⨆_{n=1}^{N} E_n ) = Σ_{n=1}^{N} P(E_n) = Σ_{n=1}^{N} P({ω_n}) = Σ_{ω∈E} P({ω}),

as desired. The proof in the case when E = {ω_1, ω_2, ...} is nearly identical, only N is replaced with ∞, and Axiom A3 is used.

Given a finite set A, let n(A) denote the number of elements in A. For A = {1, 2, 3} we have n(A) = 3, and for the empty set ∅ we have n(∅) = 0.

Theorem 4.22. Let Ω be a sample space such that n(Ω) = n ≥ 1, and suppose P is a probability measure on P(Ω). If P({ω}) = 1/n for each ω ∈ Ω, then

    P(E) = n(E) / n(Ω)   (4.3)

for any E ⊆ Ω.

Proof. Suppose P({ω}) = 1/n for each ω ∈ Ω, and let E ⊆ Ω. The sample space Ω is discrete since it is finite. Assuming E ≠ ∅, Theorem 4.21 implies that

    P(E) = Σ_{ω∈E} P({ω}) = Σ_{ω∈E} 1/n = n(E)/n = n(E)/n(Ω),

thereby affirming (4.3). In the case when E = ∅, Proposition 4.20(1) implies that P(E) = 0. But we also have n(E) = 0, and so (4.3) obtains once more.

According to Theorem 4.22, if an experiment has a finite sample space Ω, and each outcome in Ω is equally likely to occur, then the probability of an event E ⊆ Ω is given by (4.3). Such a probability P(E) is sometimes referred to as the classical probability of E.

Example 4.23. Let E be the experiment consisting of a single roll of a single eight-sided die having sides numbered from 1 to 8. Find the probability of rolling a perfect square.

Solution. The sample space of the experiment is Ω = {1, 2, 3, 4, 5, 6, 7, 8}, which is finite. We assume the die to be balanced, so that no one number is more likely to be rolled than another.

Let E be the event of rolling a perfect square. There are only two perfect squares on the eight-sided die: 1 (since 1 = 1^2) and 4 (since 4 = 2^2), and thus E = {1, 4}. By Theorem 4.22, the probability of E is

    P(E) = n(E)/n(Ω) = 2/8 = 1/4.

This is to say there is one chance in four that the event E will occur.

Example 4.24. Let E be the experiment of drawing two cards in succession from a standard deck of playing cards, without replacement. Find the probability of getting a king on the first draw, and a card with an even number on the second draw.

Solution. Let A be the event of getting a king on the 1st draw and any card on the 2nd draw, and let B be the event of drawing any card on the 1st draw and an even-numbered card on the 2nd draw. The elements of these sets may be presented as ordered pairs (X, Y), where X denotes the first card drawn and Y the second card drawn. Hence if K denotes a king and E an even-numbered card, then A has elements of the form (K, Y), and B has elements of the form (X, E). It follows that the elements of the set A ∩ B are pairs of the form (K, E), which is to say A ∩ B consists precisely of those outcomes in the sample space Ω of the experiment E wherein a king is drawn first and an even-numbered card second. Thus we seek the probability of the event A ∩ B.

To find P(A ∩ B) we need n(A ∩ B) and n(Ω). How many elements are there in the set A ∩ B? This is simply to ask how many ordered pairs of the form (K, E) there are. There are 4 choices for K: it could be a king of hearts, diamonds, clubs, or spades; meanwhile there are 20 choices for E: it could be a 2, 4, 6, 8, or 10 in any one of the four suits. Therefore by the Fundamental Counting Principle there are (4)(20) = 80 possible pairs of the form (K, E), and so n(A ∩ B) = 80.

Now for n(Ω), the total number of possible outcomes of the experiment E. Elements of the set Ω are pairs of the form (X, Y), with X being any one of the 52 cards in the deck before the first card is drawn, and Y being any one of the remaining 51 cards available after the first card is drawn. Thus n(Ω) = (52)(51) = 2652. Finally, by Theorem 4.22 we find that

    P(A ∩ B) = n(A ∩ B)/n(Ω) = 80/2652 = 20/663,

our answer.

Some further properties of a probability measure P that will be useful in Chapter 6 follow presently. First we introduce some new notation: given a sequence of sets (A_n)_{n=1}^∞, we write A_n ↑ A if A = ⋃_{n=1}^∞ A_n and A_n ⊆ A_{n+1} for all n, and we write A_n ↓ A if A = ⋂_{n=1}^∞ A_n and A_n ⊇ A_{n+1} for all n.

Proposition 4.25. Let Ω be a sample space, P a probability measure on a σ-algebra Σ on Ω, and (E_n)_{n=1}^∞ a sequence in Σ. If E_n ↑ E or E_n ↓ E, then

    lim_{n→∞} P(E_n) = P(E).
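
Example 4.24 can be checked by enumerating the whole sample space (a sketch; the deck encoding is my own):

    from fractions import Fraction
    from itertools import permutations

    ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
    suits = ["hearts", "diamonds", "clubs", "spades"]
    deck = [(rank, suit) for rank in ranks for suit in suits]

    # All ordered pairs of distinct cards: n(Omega) = 52 * 51 = 2652.
    omega = list(permutations(deck, 2))

    even = {"2", "4", "6", "8", "10"}
    favorable = sum(1 for first, second in omega
                    if first[0] == "K" and second[0] in even)

    print(len(omega), favorable)            # 2652 80
    print(Fraction(favorable, len(omega)))  # 20/663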

Proof. Suppose E_n ↑ E, so E_n ⊆ E_{n+1} for each n ∈ N, and E = ⋃_{n=1}^∞ E_n. If we define E_0 = ∅, then (E_n ∖ E_{n-1})_{n=1}^∞ is a mutually exclusive sequence in Σ such that

    E = ⋃_{n=1}^∞ E_n = ⨆_{n=1}^∞ (E_n ∖ E_{n-1}).

By Axiom A3 followed by Proposition 4.20(5),

    P(E) = P( ⨆_{n=1}^∞ (E_n ∖ E_{n-1}) ) = Σ_{n=1}^∞ P(E_n ∖ E_{n-1}) = Σ_{n=1}^∞ [ P(E_n) - P(E_{n-1}) ]
         = lim_{n→∞} Σ_{k=1}^{n} [ P(E_k) - P(E_{k-1}) ] = lim_{n→∞} [ P(E_n) - P(E_0) ] = lim_{n→∞} P(E_n),

as desired, where P(E_0) = P(∅) = 0 by Proposition 4.20(1).

Now suppose E_n ↓ E, so E_n ⊇ E_{n+1} for each n ∈ N, and E = ⋂_{n=1}^∞ E_n. Then E_n^c ⊆ E_{n+1}^c and E^c = ⋃_{n=1}^∞ E_n^c, which is to say E_n^c ↑ E^c, and so by our previous finding we have

    lim_{n→∞} P(E_n^c) = P(E^c).

By Proposition 4.20(6) it follows that

    lim_{n→∞} [1 - P(E_n)] = 1 - P(E),

which immediately yields

    lim_{n→∞} P(E_n) = P(E),

as desired.

4.3 Conditional Probability

The idea that the occurrence of one event may affect the chances of another event occurring is a natural one which motivates the following definition.

Definition 4.26. Let P be a probability measure on a σ-algebra Σ. If A, B ∈ Σ are such that P(B) > 0, then the conditional probability of A given B is

    P(A | B) = P(A ∩ B) / P(B).   (4.4)

The symbol P(A | B) may be read as "the probability of event A given the occurrence of event B", and while "given" seems to connote the existence of an external timeline in which B occurs before A, it is not necessarily so. The value P(A ∩ B) is sometimes called the joint probability of A and B, which is the probability of events A and B both occurring (though not necessarily simultaneously in time).

Definition 4.27. Two events A and B are independent if P(A ∩ B) = P(A)P(B).

Proposition 4.28. Let A and B be events such that P(B) > 0. Then A and B are independent if and only if P(A | B) = P(A).

Proof. Suppose A and B are independent. Then by Definition 4.26,

    P(A | B) = P(A ∩ B)/P(B) = P(A)P(B)/P(B) = P(A).

Now suppose that P(A | B) = P(A). Then by Definition 4.26,

    P(A ∩ B) = P(A | B)P(B) = P(A)P(B),

finishing the proof.

If it is given that P(A | B) = P(A), then by Definition 4.26 it must necessarily be the case that P(B) > 0, and so A and B must be independent by Proposition 4.28. The same conclusion follows if P(B | A) = P(B).

Example 4.29. Let E be the experiment of Example 4.24: two cards are drawn in succession from a standard deck of playing cards, without replacement. Use Definition 4.26 to find the probability of getting an even-numbered card on the second draw, given that a king was taken on the first draw.

Solution. As in Example 4.24, let A be the event of getting a king on the 1st draw and any card on the 2nd draw, and let B be the event of drawing any card on the 1st draw and an even-numbered card on the 2nd draw. We must determine P(A), which in turn requires finding n(A). The set A consists precisely of pairs of the form (K, Y), where K is any king taken on the 1st draw (4 choices), and Y is any one of the cards remaining in the deck after the 1st draw (51 choices).

The Fundamental Counting Principle thus implies that n(A) = (4)(51) = 204, whereas we found that n(Ω) = 2652 in Example 4.24. By Theorem 4.22,

    P(A) = n(A)/n(Ω) = 204/2652 = 1/13.

In Example 4.24 we obtained P(A ∩ B) = 20/663, and so by Definition 4.26 we have

    P(B | A) = P(A ∩ B)/P(A) = (20/663)/(1/13) = 20/51.

Remark. There is, to be sure, a quicker way to figure out the probability of getting an even-numbered card on the second draw, given that a king was taken on the first draw. There are 51 cards remaining in the deck after a king is taken out, and 20 of those 51 cards are even-numbered. This immediately implies that the answer is 20/51, and so we see that the curious formula for determining a conditional probability given by Definition 4.26 agrees with our intuition.
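
A quick check of Example 4.29 with exact arithmetic (a sketch using Python's fractions module):

    from fractions import Fraction

    p_a = Fraction(4 * 51, 52 * 51)        # P(A): a king first, any card second = 1/13
    p_a_and_b = Fraction(4 * 20, 52 * 51)  # P(A and B): a king first, an even card second = 20/663

    print(p_a_and_b / p_a)                 # 20/51, per Definition 4.26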

4.4 The Multiplicative and Additive Laws

Theorem 4.30 (Additive Law). Let P be a probability measure on a σ-algebra Σ. If A, B ∈ Σ, then

    P(A ∪ B) = P(A) + P(B) - P(A ∩ B).   (4.5)

Proof. By Proposition 4.20(5) we have

    P( (A ∪ B) ∖ (A ∩ B) ) = P(A ∪ B) - P(A ∩ B).   (4.6)

Also, since

    (A ∪ B) ∖ (A ∩ B) = (A ∖ B) ⊔ (B ∖ A)

for mutually exclusive measurable events A ∖ B and B ∖ A, Proposition 4.20 parts (2) and (5) give

    P( (A ∪ B) ∖ (A ∩ B) ) = P( (A ∖ B) ⊔ (B ∖ A) ) = P(A ∖ B) + P(B ∖ A)
        = [ P(A) - P(A ∩ B) ] + [ P(B) - P(A ∩ B) ] = P(A) + P(B) - 2P(A ∩ B).

This result combined with (4.6) gives (4.5), finishing the proof.

The Additive Law as stated above involves just two events A and B; however, the law can be extended to involve three events or more. The three-event version we now state, as it evidences a pattern that in fact holds in the general case.

Proposition 4.31. Let P be a probability measure on a σ-algebra Σ. If A, B, C ∈ Σ, then

    P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C).

Proof. For A, B, C ∈ Σ, let D = B ∪ C. By the Additive Law,

    P(D) = P(B ∪ C) = P(B) + P(C) - P(B ∩ C),

and also

    P(A ∩ D) = P( A ∩ (B ∪ C) ) = P( (A ∩ B) ∪ (A ∩ C) ) = P(A ∩ B) + P(A ∩ C) - P( (A ∩ B) ∩ (A ∩ C) ).

Now,

    P(A ∪ B ∪ C) = P(A ∪ D) = P(A) + P(D) - P(A ∩ D)
        = P(A) + [ P(B) + P(C) - P(B ∩ C) ] - [ P(A ∩ B) + P(A ∩ C) - P( (A ∩ B) ∩ (A ∩ C) ) ],

which, upon regrouping and noting that (A ∩ B) ∩ (A ∩ C) = A ∩ B ∩ C, gives the desired outcome.

Example 4.32. Let E be the experiment of drawing two cards in succession from a standard deck of playing cards, without replacement. Find the probability of getting a king on the first draw, and a card in the suit of clubs on the second draw.

Solution. Let A be the event of getting a king on the 1st draw and any card on the 2nd draw, and let B be the event of drawing any card on the 1st draw and a club on the 2nd draw. Then the event of a king on the 1st draw and a club on the 2nd draw is A ∩ B. If a king is taken on the 1st draw, the probability of a club on the 2nd draw will depend on whether it was the K♣ that was taken, or a king in some other suit. If we let A_1 be the event of getting the K♣ on the 1st draw and any card on the 2nd draw, and A_2 the event of getting the K♥, K♦, or K♠ on the 1st draw and any card on the 2nd draw, then A = A_1 ∪ A_2 with A_1 ∩ A_2 = ∅. It follows that

    (A_1 ∩ B) ∩ (A_2 ∩ B) = (A_1 ∩ A_2) ∩ B = ∅ ∩ B = ∅,

so

    P( (A_1 ∩ B) ∩ (A_2 ∩ B) ) = P(∅) = 0,

and then by the Additive and Multiplicative Laws,

    P(A ∩ B) = P( (A_1 ∪ A_2) ∩ B ) = P( (A_1 ∩ B) ∪ (A_2 ∩ B) )
             = P(A_1 ∩ B) + P(A_2 ∩ B) - P( (A_1 ∩ B) ∩ (A_2 ∩ B) )
             = P(A_1 ∩ B) + P(A_2 ∩ B)
             = P(A_1)P(B | A_1) + P(A_2)P(B | A_2).

The final probabilities may be found in the same way the probabilities in Example 4.29 were found; however, as observed in the remark following that example, the conditional probabilities may be found in a quicker fashion (at least in this relatively simple setting). First, we have P(A_1) = 1/52 and P(A_2) = 3/52, since there is but one K♣ in a deck of 52 cards, and 3 kings in other suits. As for P(B | A_1), if the K♣ is drawn first, then there are 12 cards in the suit of clubs remaining among the 51 cards left in the deck, and so P(B | A_1) = 12/51. If the K♥, K♦, or K♠ is drawn first, then 13 clubs remain in the deck of 51, and so P(B | A_2) = 13/51. Therefore,

    P(A ∩ B) = P(A_1)P(B | A_1) + P(A_2)P(B | A_2) = (1/52)(12/51) + (3/52)(13/51) = 51/2652 = 1/52,

our answer.
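
Example 4.32 can be verified both by the two-case formula above and by brute-force enumeration (a sketch; the deck encoding is my own):

    from fractions import Fraction
    from itertools import permutations

    # Two-case computation: P(A_1)P(B | A_1) + P(A_2)P(B | A_2).
    print(Fraction(1, 52) * Fraction(12, 51) + Fraction(3, 52) * Fraction(13, 51))  # 1/52

    # Brute-force count over all ordered pairs of distinct cards.
    ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
    suits = ["clubs", "hearts", "diamonds", "spades"]
    deck = [(rank, suit) for rank in ranks for suit in suits]
    pairs = list(permutations(deck, 2))
    favorable = sum(1 for first, second in pairs
                    if first[0] == "K" and second[1] == "clubs")
    print(Fraction(favorable, len(pairs)))  # 1/52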


I - Probability. What is Probability? the chance of an event occuring. 1classical probability. 2empirical probability. 3subjective probability What is Probability? the chance of an event occuring eg 1classical probability 2empirical probability 3subjective probability Section 2 - Probability (1) Probability - Terminology random (probability)

More information

Probability (Devore Chapter Two)

Probability (Devore Chapter Two) Probability (Devore Chapter Two) 1016-345-01: Probability and Statistics for Engineers Fall 2012 Contents 0 Administrata 2 0.1 Outline....................................... 3 1 Axiomatic Probability 3

More information

Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya

Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 4: Probability, Proof Techniques, Method of Induction Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

Lecture 3. Probability and elements of combinatorics

Lecture 3. Probability and elements of combinatorics Introduction to theory of probability and statistics Lecture 3. Probability and elements of combinatorics prof. dr hab.inż. Katarzyna Zakrzewska Katedra Elektroniki, AGH e-mail: zak@agh.edu.pl http://home.agh.edu.pl/~zak

More information

LECTURE NOTES by DR. J.S.V.R. KRISHNA PRASAD

LECTURE NOTES by DR. J.S.V.R. KRISHNA PRASAD .0 Introduction: The theory of probability has its origin in the games of chance related to gambling such as tossing of a coin, throwing of a die, drawing cards from a pack of cards etc. Jerame Cardon,

More information

LECTURE 1. 1 Introduction. 1.1 Sample spaces and events

LECTURE 1. 1 Introduction. 1.1 Sample spaces and events LECTURE 1 1 Introduction The first part of our adventure is a highly selective review of probability theory, focusing especially on things that are most useful in statistics. 1.1 Sample spaces and events

More information

3.2 Probability Rules

3.2 Probability Rules 3.2 Probability Rules The idea of probability rests on the fact that chance behavior is predictable in the long run. In the last section, we used simulation to imitate chance behavior. Do we always need

More information

Week 2: Probability: Counting, Sets, and Bayes

Week 2: Probability: Counting, Sets, and Bayes Statistical Methods APPM 4570/5570, STAT 4000/5000 21 Probability Introduction to EDA Week 2: Probability: Counting, Sets, and Bayes Random variable Random variable is a measurable quantity whose outcome

More information

MATH STUDENT BOOK. 12th Grade Unit 9

MATH STUDENT BOOK. 12th Grade Unit 9 MATH STUDENT BOOK 12th Grade Unit 9 Unit 9 COUNTING PRINCIPLES MATH 1209 COUNTING PRINCIPLES INTRODUCTION 1. PROBABILITY DEFINITIONS, SAMPLE SPACES, AND PROBABILITY ADDITION OF PROBABILITIES 11 MULTIPLICATION

More information

Axioms of Probability

Axioms of Probability Sample Space (denoted by S) The set of all possible outcomes of a random experiment is called the Sample Space of the experiment, and is denoted by S. Example 1.10 If the experiment consists of tossing

More information

5. Conditional Distributions

5. Conditional Distributions 1 of 12 7/16/2009 5:36 AM Virtual Laboratories > 3. Distributions > 1 2 3 4 5 6 7 8 5. Conditional Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an

More information

Basic Probability. Introduction

Basic Probability. Introduction Basic Probability Introduction The world is an uncertain place. Making predictions about something as seemingly mundane as tomorrow s weather, for example, is actually quite a difficult task. Even with

More information

CMPSCI 240: Reasoning about Uncertainty

CMPSCI 240: Reasoning about Uncertainty CMPSCI 240: Reasoning about Uncertainty Lecture 2: Sets and Events Andrew McGregor University of Massachusetts Last Compiled: January 27, 2017 Outline 1 Recap 2 Experiments and Events 3 Probabilistic Models

More information

Chapter 4. Displaying and Summarizing. Quantitative Data

Chapter 4. Displaying and Summarizing. Quantitative Data STAT 141 Introduction to Statistics Chapter 4 Displaying and Summarizing Quantitative Data Bin Zou (bzou@ualberta.ca) STAT 141 University of Alberta Winter 2015 1 / 31 4.1 Histograms 1 We divide the range

More information

Statistics 100 Exam 2 March 8, 2017

Statistics 100 Exam 2 March 8, 2017 STAT 100 EXAM 2 Spring 2017 (This page is worth 1 point. Graded on writing your name and net id clearly and circling section.) PRINT NAME (Last name) (First name) net ID CIRCLE SECTION please! L1 (MWF

More information

Chapter 3 : Conditional Probability and Independence

Chapter 3 : Conditional Probability and Independence STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 2016 Néhémy Lim Chapter 3 : Conditional Probability and Independence 1 Conditional Probabilities How should we modify the probability of an event when

More information

2011 Pearson Education, Inc

2011 Pearson Education, Inc Statistics for Business and Economics Chapter 3 Probability Contents 1. Events, Sample Spaces, and Probability 2. Unions and Intersections 3. Complementary Events 4. The Additive Rule and Mutually Exclusive

More information

Glossary for the Triola Statistics Series

Glossary for the Triola Statistics Series Glossary for the Triola Statistics Series Absolute deviation The measure of variation equal to the sum of the deviations of each value from the mean, divided by the number of values Acceptance sampling

More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru Venkateswara Rao, Ph.D. STA 2023 Fall 2016 Venkat Mu ALL THE CONTENT IN THESE SOLUTIONS PRESENTED IN BLUE AND BLACK

More information

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14

Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 14 Introduction One of the key properties of coin flips is independence: if you flip a fair coin ten times and get ten

More information

Topic 3: Introduction to Probability

Topic 3: Introduction to Probability Topic 3: Introduction to Probability 1 Contents 1. Introduction 2. Simple Definitions 3. Types of Probability 4. Theorems of Probability 5. Probabilities under conditions of statistically independent events

More information

Discrete Probability. Chemistry & Physics. Medicine

Discrete Probability. Chemistry & Physics. Medicine Discrete Probability The existence of gambling for many centuries is evidence of long-running interest in probability. But a good understanding of probability transcends mere gambling. The mathematics

More information

Chapter 1 The Real Numbers

Chapter 1 The Real Numbers Chapter 1 The Real Numbers In a beginning course in calculus, the emphasis is on introducing the techniques of the subject;i.e., differentiation and integration and their applications. An advanced calculus

More information

1 Basic Combinatorics

1 Basic Combinatorics 1 Basic Combinatorics 1.1 Sets and sequences Sets. A set is an unordered collection of distinct objects. The objects are called elements of the set. We use braces to denote a set, for example, the set

More information

ELEG 3143 Probability & Stochastic Process Ch. 1 Probability

ELEG 3143 Probability & Stochastic Process Ch. 1 Probability Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 1 Probability Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Applications Elementary Set Theory Random

More information

Introduction and basic definitions

Introduction and basic definitions Chapter 1 Introduction and basic definitions 1.1 Sample space, events, elementary probability Exercise 1.1 Prove that P( ) = 0. Solution of Exercise 1.1 : Events S (where S is the sample space) and are

More information

Sets and Set notation. Algebra 2 Unit 8 Notes

Sets and Set notation. Algebra 2 Unit 8 Notes Sets and Set notation Section 11-2 Probability Experimental Probability experimental probability of an event: Theoretical Probability number of time the event occurs P(event) = number of trials Sample

More information

P (A B) P ((B C) A) P (B A) = P (B A) + P (C A) P (A) = P (B A) + P (C A) = Q(A) + Q(B).

P (A B) P ((B C) A) P (B A) = P (B A) + P (C A) P (A) = P (B A) + P (C A) = Q(A) + Q(B). Lectures 7-8 jacques@ucsdedu 41 Conditional Probability Let (Ω, F, P ) be a probability space Suppose that we have prior information which leads us to conclude that an event A F occurs Based on this information,

More information

PROBABILITY THEORY 1. Basics

PROBABILITY THEORY 1. Basics PROILITY THEORY. asics Probability theory deals with the study of random phenomena, which under repeated experiments yield different outcomes that have certain underlying patterns about them. The notion

More information

Topic 5 Basics of Probability

Topic 5 Basics of Probability Topic 5 Basics of Probability Equally Likely Outcomes and the Axioms of Probability 1 / 13 Outline Equally Likely Outcomes Axioms of Probability Consequences of the Axioms 2 / 13 Introduction A probability

More information

JUSTIN HARTMANN. F n Σ.

JUSTIN HARTMANN. F n Σ. BROWNIAN MOTION JUSTIN HARTMANN Abstract. This paper begins to explore a rigorous introduction to probability theory using ideas from algebra, measure theory, and other areas. We start with a basic explanation

More information

1. Discrete Distributions

1. Discrete Distributions Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 1. Discrete Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space Ω.

More information

Discrete Random Variables (1) Solutions

Discrete Random Variables (1) Solutions STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 06 Néhémy Lim Discrete Random Variables ( Solutions Problem. The probability mass function p X of some discrete real-valued random variable X is given

More information

Lecture Slides. Elementary Statistics Tenth Edition. by Mario F. Triola. and the Triola Statistics Series. Slide 1

Lecture Slides. Elementary Statistics Tenth Edition. by Mario F. Triola. and the Triola Statistics Series. Slide 1 Lecture Slides Elementary Statistics Tenth Edition and the Triola Statistics Series by Mario F. Triola Slide 1 4-1 Overview 4-2 Fundamentals 4-3 Addition Rule Chapter 4 Probability 4-4 Multiplication Rule:

More information

Probability & Random Variables

Probability & Random Variables & Random Variables Probability Probability theory is the branch of math that deals with random events, processes, and variables What does randomness mean to you? How would you define probability in your

More information

Basic counting techniques. Periklis A. Papakonstantinou Rutgers Business School

Basic counting techniques. Periklis A. Papakonstantinou Rutgers Business School Basic counting techniques Periklis A. Papakonstantinou Rutgers Business School i LECTURE NOTES IN Elementary counting methods Periklis A. Papakonstantinou MSIS, Rutgers Business School ALL RIGHTS RESERVED

More information

CHAPTER - 16 PROBABILITY Random Experiment : If an experiment has more than one possible out come and it is not possible to predict the outcome in advance then experiment is called random experiment. Sample

More information

Lecture Lecture 5

Lecture Lecture 5 Lecture 4 --- Lecture 5 A. Basic Concepts (4.1-4.2) 1. Experiment: A process of observing a phenomenon that has variation in its outcome. Examples: (E1). Rolling a die, (E2). Drawing a card form a shuffled

More information

MATH 556: PROBABILITY PRIMER

MATH 556: PROBABILITY PRIMER MATH 6: PROBABILITY PRIMER 1 DEFINITIONS, TERMINOLOGY, NOTATION 1.1 EVENTS AND THE SAMPLE SPACE Definition 1.1 An experiment is a one-off or repeatable process or procedure for which (a there is a well-defined

More information

Lecture notes for probability. Math 124

Lecture notes for probability. Math 124 Lecture notes for probability Math 124 What is probability? Probabilities are ratios, expressed as fractions, decimals, or percents, determined by considering results or outcomes of experiments whose result

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

Basic Concepts of Probability

Basic Concepts of Probability Probability Probability theory is the branch of math that deals with unpredictable or random events Probability is used to describe how likely a particular outcome is in a random event the probability

More information

Lecture 2: Probability. Readings: Sections Statistical Inference: drawing conclusions about the population based on a sample

Lecture 2: Probability. Readings: Sections Statistical Inference: drawing conclusions about the population based on a sample Lecture 2: Probability Readings: Sections 5.1-5.3 1 Introduction Statistical Inference: drawing conclusions about the population based on a sample Parameter: a number that describes the population a fixed

More information

In this initial chapter, you will be introduced to, or more than likely be reminded of, a

In this initial chapter, you will be introduced to, or more than likely be reminded of, a 1 Sets In this initial chapter, you will be introduced to, or more than likely be reminded of, a fundamental idea that occurs throughout mathematics: sets. Indeed, a set is an object from which every mathematical

More information

Outcomes, events, and probability

Outcomes, events, and probability 2 Outcomes, events, and probability The world around us is full of phenomena we perceive as random or unpredictable We aim to model these phenomena as outcomes of some experiment, where you should think

More information

Fundamentals of Probability CE 311S

Fundamentals of Probability CE 311S Fundamentals of Probability CE 311S OUTLINE Review Elementary set theory Probability fundamentals: outcomes, sample spaces, events Outline ELEMENTARY SET THEORY Basic probability concepts can be cast in

More information

Statistics Primer. A Brief Overview of Basic Statistical and Probability Principles. Essential Statistics for Data Analysts Using Excel

Statistics Primer. A Brief Overview of Basic Statistical and Probability Principles. Essential Statistics for Data Analysts Using Excel Statistics Primer A Brief Overview of Basic Statistical and Probability Principles Liberty J. Munson, PhD 9/19/16 Essential Statistics for Data Analysts Using Excel Table of Contents What is a Variable?...

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions 1999 Prentice-Hall, Inc. Chap. 4-1 Chapter Topics Basic Probability Concepts: Sample

More information

HW2 Solutions, for MATH441, STAT461, STAT561, due September 9th

HW2 Solutions, for MATH441, STAT461, STAT561, due September 9th HW2 Solutions, for MATH44, STAT46, STAT56, due September 9th. You flip a coin until you get tails. Describe the sample space. How many points are in the sample space? The sample space consists of sequences

More information

Principles of Real Analysis I Fall I. The Real Number System

Principles of Real Analysis I Fall I. The Real Number System 21-355 Principles of Real Analysis I Fall 2004 I. The Real Number System The main goal of this course is to develop the theory of real-valued functions of one real variable in a systematic and rigorous

More information

Unit 4 Probability. Dr Mahmoud Alhussami

Unit 4 Probability. Dr Mahmoud Alhussami Unit 4 Probability Dr Mahmoud Alhussami Probability Probability theory developed from the study of games of chance like dice and cards. A process like flipping a coin, rolling a die or drawing a card from

More information

Lecture Slides. Elementary Statistics Eleventh Edition. by Mario F. Triola. and the Triola Statistics Series 4.1-1

Lecture Slides. Elementary Statistics Eleventh Edition. by Mario F. Triola. and the Triola Statistics Series 4.1-1 Lecture Slides Elementary Statistics Eleventh Edition and the Triola Statistics Series by Mario F. Triola 4.1-1 4-1 Review and Preview Chapter 4 Probability 4-2 Basic Concepts of Probability 4-3 Addition

More information

(a) Fill in the missing probabilities in the table. (b) Calculate P(F G). (c) Calculate P(E c ). (d) Is this a uniform sample space?

(a) Fill in the missing probabilities in the table. (b) Calculate P(F G). (c) Calculate P(E c ). (d) Is this a uniform sample space? Math 166 Exam 1 Review Sections L.1-L.2, 1.1-1.7 Note: This review is more heavily weighted on the new material this week: Sections 1.5-1.7. For more practice problems on previous material, take a look

More information

UNIT 5 ~ Probability: What Are the Chances? 1

UNIT 5 ~ Probability: What Are the Chances? 1 UNIT 5 ~ Probability: What Are the Chances? 1 6.1: Simulation Simulation: The of chance behavior, based on a that accurately reflects the phenomenon under consideration. (ex 1) Suppose we are interested

More information

(i) Given that a student is female, what is the probability of having a GPA of at least 3.0?

(i) Given that a student is female, what is the probability of having a GPA of at least 3.0? MATH 382 Conditional Probability Dr. Neal, WKU We now shall consider probabilities of events that are restricted within a subset that is smaller than the entire sample space Ω. For example, let Ω be the

More information

1 of 7 8/11/2014 10:27 AM Units: Teacher: PacedAlgebraPartA, CORE Course: PacedAlgebraPartA Year: 2012-13 The Language of Algebra What is the difference between an algebraic expression and an algebraic

More information

14 - PROBABILITY Page 1 ( Answers at the end of all questions )

14 - PROBABILITY Page 1 ( Answers at the end of all questions ) - PROBABILITY Page ( ) Three houses are available in a locality. Three persons apply for the houses. Each applies for one house without consulting others. The probability that all the three apply for the

More information

Stat 225 Week 1, 8/20/12-8/24/12, Notes: Set Theory

Stat 225 Week 1, 8/20/12-8/24/12, Notes: Set Theory Stat 225 Week 1, 8/20/12-8/24/12, Notes: Set Theory The Fall 2012 Stat 225 T.A.s September 7, 2012 The material in this handout is intended to cover general set theory topics. Information includes (but

More information

Connectedness. Proposition 2.2. The following are equivalent for a topological space (X, T ).

Connectedness. Proposition 2.2. The following are equivalent for a topological space (X, T ). Connectedness 1 Motivation Connectedness is the sort of topological property that students love. Its definition is intuitive and easy to understand, and it is a powerful tool in proofs of well-known results.

More information

Probabilistic models

Probabilistic models Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became

More information

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II

Outline Conditional Probability The Law of Total Probability and Bayes Theorem Independent Events. Week 4 Classical Probability, Part II Week 4 Classical Probability, Part II Week 4 Objectives This week we continue covering topics from classical probability. The notion of conditional probability is presented first. Important results/tools

More information

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed.

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. STAT 302 Introduction to Probability Learning Outcomes Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. Chapter 1: Combinatorial Analysis Demonstrate the ability to solve combinatorial

More information

JUMPMath. Manitoba Curriculum Correlations. Contents

JUMPMath. Manitoba Curriculum Correlations. Contents Manitoba Curriculum Correlations JUMPMath Contents Grade 1 JUMP Math Manitoba Curriculum Correlation 1-1 Grade 2 JUMP Math Manitoba Curriculum Correlation 2-1 Grade 3 JUMP Math Manitoba Curriculum Correlation

More information

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability

Chapter 2. Conditional Probability and Independence. 2.1 Conditional Probability Chapter 2 Conditional Probability and Independence 2.1 Conditional Probability Example: Two dice are tossed. What is the probability that the sum is 8? This is an easy exercise: we have a sample space

More information

Section 3 2 Probability Genetics Answers

Section 3 2 Probability Genetics Answers We have made it easy for you to find a PDF Ebooks without any digging. And by having access to our ebooks online or by storing it on your computer, you have convenient answers with section 3 2 probability

More information

Chapter 8 Sequences, Series, and Probability

Chapter 8 Sequences, Series, and Probability Chapter 8 Sequences, Series, and Probability Overview 8.1 Sequences and Series 8.2 Arithmetic Sequences and Partial Sums 8.3 Geometric Sequences and Partial Sums 8.5 The Binomial Theorem 8.6 Counting Principles

More information