CHAPTER 4 PROBABILITY AND PROBABILITY DISTRIBUTIONS


4.2 Events and Sample Space

Definition 1. An experiment is the process by which an observation (or measurement) is obtained.

Examples 1.
1. Tossing a pair of fair coins.
2. Rolling a balanced die.
3. Recording the price of a commodity at a certain type of department store.
4. Launching a missile and observing its velocity at a specified time.
5. Observing the gender of a newborn child.
6. Drawing a card from an ordinary deck of 52 playing cards.
7. Inspecting a lot of computer chips and recording the number of defective chips.
8. Determining the dosage that must be given to a patient until the patient reacts positively.

Definition 2. A simple event is the outcome that is observed on a single repetition of the experiment.

Definition 3. The set of all simple events is called the sample space, denoted by S.

Examples 2. Referring to Example 1, the sample spaces are:
1. S consists of the four outcomes HH, HT, TH, and TT.
2. S consists of the six integers 1, 2, 3, 4, 5, and 6.
3. S consists of all positive real numbers.
4. S consists of all positive real numbers.
5. S consists of the two outcomes boy and girl.
6. S consists of 13 spades, 13 hearts, 13 diamonds, and 13 clubs.
7. S consists of all non-negative integers 0, 1, 2, 3, 4, ...
8. S consists of all positive real numbers.
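The finite sample spaces above can be enumerated mechanically. A minimal Python sketch (an illustration added here, not part of the original notes):

```python
from itertools import product

# Sample space for tossing a pair of coins (Example 2, case 1)
coin_pair = ["".join(outcome) for outcome in product("HT", repeat=2)]

# Sample space for rolling a balanced die (Example 2, case 2)
die = list(range(1, 7))

print(coin_pair)  # ['HH', 'HT', 'TH', 'TT']
print(die)        # [1, 2, 3, 4, 5, 6]
```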

Definition 4. An event is a collection of simple events, often denoted by a capital letter. Any event is a subset of the sample space.

Example 3. Consider the experiment of tossing a single balanced die. The sample space consists of the six integers 1, 2, 3, 4, 5, and 6. Now define the following events:

A: Observe an odd number.
B: Observe an even number.
C: Observe a number less than 4.
D: Observe a 2 or a 5.

Then

A = {1, 3, 5},    B = {2, 4, 6},    C = {1, 2, 3},    D = {2, 5}.

Definition 5. Two events A and B are said to be mutually exclusive if they have nothing in common; that is, when one event occurs, the other cannot, and vice versa.

Example 4. In Example 3, events A and B are mutually exclusive, because they have nothing in common.

Some experiments can be generated in stages, and the sample space can be displayed in a tree diagram.

Example 5. Toss 3 fair coins. The sample space can be displayed in the following tree diagram, in which each left-to-right path corresponds to one outcome:

First toss   Second toss   Third toss   Outcome
H            H             H            HHH
                           T            HHT
             T             H            HTH
                           T            HTT
T            H             H            THH
                           T            THT
             T             H            TTH
                           T            TTT

Therefore, the sample space is given as

S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.

4.3 Calculating Probabilities Using Simple Events

If an experiment is performed n times, then the relative frequency of an event A is defined by

Relative frequency of A = Frequency / n,

where Frequency is the number of times the event A occurred. Intuitively, one way of defining probability is as follows:

P(A) = lim_{n→∞} Frequency / n.

Clearly, we have the following properties.

Properties of Probabilities:
1. P(A) must be a proportion lying between 0 and 1;
2. P(A) = 0 if the event A never occurs; and
3. P(A) = 1 if the event A always occurs.

Requirements for Simple-Event Probabilities:
1. Each probability must lie between 0 and 1.
2. The sum of the probabilities for all simple events in the sample space S equals 1.

Definition 6. The probability of an event A is equal to the sum of the probabilities of the simple events contained in A.

One simple way of assigning these probabilities is to assume equally likely probabilities for all simple events, as demonstrated in the following example.

Example 6. Toss 3 fair coins. It was shown in Example 5 that the sample space consists of eight outcomes:

HHH, HHT, HTH, HTT, THH, THT, TTH, TTT.

It is natural to assume equally likely probabilities; that is, each outcome has probability 1/8. Define

A: Observe exactly two heads.

Then event A consists of three outcomes: HHT, HTH, THH. Hence,

P(A) = 1/8 + 1/8 + 1/8 = 3/8.

Example 7. In a study of the loans made to corporate borrowers that matured during the past year, an officer of a commercial bank classified these loans into the following categories with respect to collection experience: excellent, good, fair, poor, and bad. The following table gives the proportion in each category:

Collection Experience of Loans    Proportion
Excellent                         0.48
Good                              0.35
Fair                              0.12
Poor                              0.03
Bad                               0.02

If a loan is chosen at random, what is the probability that it is better than "fair"?

Solution. Although the five simple events, excellent, good, fair, poor, and bad, do not have equally likely probabilities, their probabilities are given exactly. The event of interest consists of two simple events, "excellent" and "good", so

P{loan is better than "fair"} = P{"excellent"} + P{"good"} = 0.48 + 0.35 = 0.83.

HOMEWORK: pp. 134-136: 4.1, 4.3, 4.5, 4.9, 4.11, 4.13, 4.15

4.5 Event Relations and Probability Rules

Definition 7. The union of events A and B, denoted by A ∪ B, is the event that either A or B or both occur.

Definition 8. The intersection of events A and B, denoted by A ∩ B, is the event that both A and B occur.

Definition 9. The complement of an event A, denoted by A^c, is the event that A does not occur.

Example 8. Consider the experiment of tossing 2 fair coins. Then the sample space is given by

S = {HH, HT, TH, TT}.
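The calculation in Example 7 is just Definition 6 applied to a table of simple-event probabilities. A short Python sketch (illustrative, with the probabilities taken from the table above):

```python
# Simple-event probabilities from the loan table in Example 7
loans = {"excellent": 0.48, "good": 0.35, "fair": 0.12, "poor": 0.03, "bad": 0.02}

# Definition 6: P(A) is the sum of the probabilities of the simple events in A
better_than_fair = loans["excellent"] + loans["good"]
print(round(better_than_fair, 2))  # 0.83
```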

Define

A: Observe at least one head.
B: Observe at least one tail.

Then

A = {HH, HT, TH}            B = {HT, TH, TT}
A ∪ B = {HH, HT, TH, TT}    A ∩ B = {HT, TH}
A^c = {TT}                  B^c = {HH}.

Therefore,

P(A) = 3/4              P(B) = 3/4
P(A ∪ B) = 4/4 = 1      P(A ∩ B) = 2/4 = 1/2
P(A^c) = 1/4            P(B^c) = 1/4.

Theorem 1 (The Addition Rule). Given two events A and B, the probability of their union A ∪ B is

P(A ∪ B) = P(A) + P(B) - P(A ∩ B).

When two events A and B are mutually exclusive, P(A ∩ B) = 0, and so

P(A ∪ B) = P(A) + P(B).

(A Venn diagram in the original illustrates the intuition of this rule: adding the areas of A and B counts the overlap A ∩ B twice, so it must be subtracted once.)

Example 9. In Example 8, it is clear that P(A ∪ B) = 1 and

P(A) + P(B) - P(A ∩ B) = 3/4 + 3/4 - 1/2 = 1.

Consequently,

P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
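The Addition Rule can be verified by enumeration for the events of Example 8. A minimal Python sketch (added here for illustration; exact arithmetic via `fractions`):

```python
from fractions import Fraction

# Example 8: two fair coins
S = {"HH", "HT", "TH", "TT"}
A = {"HH", "HT", "TH"}   # at least one head
B = {"HT", "TH", "TT"}   # at least one tail

def P(event):
    # Equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

# Addition Rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)  # 1 1
```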

Example 10. Refer to Example 7. Let

A = {an excellent loan}    and    B = {a good loan}.

Clearly, A and B are mutually exclusive. Hence,

P(A ∪ B) = P(A) + P(B) = 0.48 + 0.35 = 0.83.

Example 11. The employment status of the adults in a small town is given in the following table. Suppose one adult is selected at random from the town.

              Employed (E)    Unemployed (N)
Male (M)      0.44            0.10
Female (F)    0.32            0.14

(1) What is the probability that the adult is a female?
(2) What is the probability that the adult is employed?
(3) What is the probability that the adult is a female or employed?

Solution.
(1) The probability that the adult is a female is P(F) = 0.32 + 0.14 = 0.46.
(2) The probability that the adult is employed is P(E) = 0.44 + 0.32 = 0.76.
(3) The probability that the adult is a female or employed is

P(E ∪ F) = P(E) + P(F) - P(E ∩ F) = 0.76 + 0.46 - 0.32 = 0.90.

For any event A, we have S = A ∪ A^c, and A and A^c are mutually exclusive, so it is easy to see that

1 = P(S) = P(A) + P(A^c).

Consequently, we have the following result.

Theorem 2 (Rule for Complements). For any event A, the probability of the complement of A is

P(A^c) = 1 - P(A).

Example 12. Refer to Example 8. It was found that P(A) = 3/4 and P(A^c) = 1/4. Hence,

1 - P(A) = 1 - 3/4 = 1/4 = P(A^c).

Example 13. From past experience, the probabilities that an automobile mechanic will service 3, 4, 5, 6, 7, 8, 9, or more than 9 cars on any given work day are 0.12, 0.19, 0.22, 0.20, 0.10, 0.07, 0.06, and 0.04, respectively. What is the probability that he will service at least 4 cars on his next day at work?

Solution. Let A denote the event that at least 4 cars are serviced. Then A^c is the event that fewer than 4 cars are serviced, so

P(A^c) = P{3 cars are serviced} = 0.12.

Hence, it follows from the Rule for Complements that

P(A) = 1 - P(A^c) = 1 - 0.12 = 0.88.

Example 14. What is the probability of observing at least one head when a fair coin is tossed 10 times?

Solution. First of all, it can be seen from a tree diagram that the sample space consists of

2 × 2 × ⋯ × 2 (10 factors) = 2^10

possible outcomes. Define

A: Observe no head.
A^c: Observe at least one head.

Note that A consists of only one outcome (tails on all 10 tosses). Thus,

P(A) = 1/2^10.

Therefore, it follows from the Rule for Complements that

P(A^c) = 1 - P(A) = 1 - 1/2^10.
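Both complement-rule calculations above take one line each in Python (a sketch added for illustration):

```python
# Example 14: at least one head in 10 tosses, via P(A^c) = 1 - P(A)
p_no_head = 1 / 2**10               # the single all-tails outcome
p_at_least_one_head = 1 - p_no_head
print(p_at_least_one_head)

# Example 13: mechanic services at least 4 cars
p_at_least_4 = 1 - 0.12
print(p_at_least_4)  # 0.88
```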

4.6 Independence, Conditional Probability, and the Multiplication Rule

Conditional Probability

In a single toss of a balanced die, what is the probability of obtaining (1) a 2? (2) a 2, given the occurrence of an even number? The answer to Question (1) is obviously 1/6. In Question (2), the sample space is reduced to {2, 4, 6}, so the desired probability is 1/3. The probability described in Question (2) is referred to as a conditional probability.

The probability of an event may vary depending upon the occurrence or nonoccurrence of one or more related events. In many situations, we are given that an event B has occurred and we want to know the probability that event A occurred, conditioned on, or given, the occurrence of the event B. Such a probability is called a conditional probability and is denoted by P(A|B) to distinguish it from the unconditional probability P(A) of the occurrence of A.

Example 15. A manufacturer of programmable DVD players may order silicon chips from any of several different suppliers. He might want to know the probability that the proportion of defective chips in an arriving lot is less than 3 percent. This is an unconditional probability. Given that an arriving lot was produced by a specific supplier, he would be interested in the probability that the proportion of defective chips is less than 3 percent; this is the conditional probability of A (the defective proportion is less than 3 percent) given that event B has occurred (the lot was produced by that specific supplier).

Definition 10. The probability of an event A, given that event B has occurred, is called the conditional probability of A given B, denoted by P(A|B).

The conditional probability can be used in calculating the probability that both A and B occur when the experiment is performed.

Theorem 3 (The Multiplication Rule). The probability that both A and B occur when the experiment is performed is

P(A ∩ B) = P(A) P(B|A)    or    P(A ∩ B) = P(B) P(A|B).

Example 16. A box contains 20 silicon chips, of which 5 are defective. If 2 chips are chosen at random and removed from the box in succession without replacing the first, what is the probability that both chips are defective?

Solution. Define

A: The first chip chosen is defective.
B: The second chip chosen is defective.

Then A ∩ B is the event that both chips chosen are defective. The probability that the first chip chosen is defective is 5/20 = 1/4; given that, the probability that the second chip chosen (from the remaining 19) is defective is 4/19. Hence, it follows from the Multiplication Rule that

P(A ∩ B) = P(A) P(B|A) = (1/4)(4/19) = 1/19.

Sometimes we may have to use the Multiplication Rule in a slightly different form in order to calculate a conditional probability; just rearrange the terms in the Multiplication Rule.

Theorem 4 (Conditional Probability). The conditional probability of event A, given that event B has occurred, is

P(A|B) = P(A ∩ B) / P(B),    provided P(B) ≠ 0.

The conditional probability of event B, given that event A has occurred, is

P(B|A) = P(A ∩ B) / P(A),    provided P(A) ≠ 0.

Example 17. Refer to Example 11. The proportion of employed adults among the females is the conditional probability of being employed, given that the adult is a female:

P(E|F) = P(E ∩ F) / P(F) = 0.32 / 0.46 = 0.696.

The proportion of females among the employed adults is the conditional probability of being a female, given that the adult is employed:

P(F|E) = P(E ∩ F) / P(E) = 0.32 / 0.76 = 0.421.

Independence

Definition 11. Two events A and B are said to be independent if and only if the probability of event B is not influenced or affected by the occurrence of event A, or vice versa.

Example 18. Consider an experiment of tossing a single balanced die twice. Define

A: Observe a 3 on the first toss.
B: Observe a 3 on the second toss.

Since the two tosses are independent, events A and B are intuitively independent. This can be further justified as follows. The probability of event A is clearly P(A) = 1/6.
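The Multiplication Rule calculation of Example 16 can be sketched with exact fractions (an illustration added to the notes):

```python
from fractions import Fraction

# Example 16: two chips drawn without replacement from 20 chips, 5 defective
p_first_defective = Fraction(5, 20)        # P(A)
p_second_given_first = Fraction(4, 19)     # P(B|A): 4 defectives left among 19 chips
p_both = p_first_defective * p_second_given_first  # Multiplication Rule
print(p_both)  # 1/19
```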

Regardless of whether event A has or has not occurred, the probability of observing a 3 on the second toss is still 1/6. We may write

P{B given that A occurred} = P(B|A) = 1/6

and

P{B given that A did not occur} = P(B|A^c) = 1/6.

Since the probability of event B is not changed by the occurrence of event A, we say that A and B are independent.

When events A and B are independent,
(1) the probability of A remains the same whether or not event B has occurred; that is, P(A|B) = P(A);
(2) the probability of B remains the same whether or not event A has occurred; that is, P(B|A) = P(B).

Either (1) or (2) is equivalent to the following equality:

P(A ∩ B) = P(A) P(B).

Definition 12 (Checking for Independence). Two events A and B are said to be independent if and only if any of the following equalities holds:
(1) P(A|B) = P(A);
(2) P(B|A) = P(B);
(3) P(A ∩ B) = P(A) P(B).
Otherwise, the events are said to be dependent.

Example 19. Consider an experiment of tossing two fair coins. Define

A: Observe a head on the first coin.
B: Observe a tail on the second coin.

Are the two events A and B independent?

Solution. Clearly, the sample space is

S = {HH, HT, TH, TT}

and

A = {HH, HT},    B = {HT, TT},    A ∩ B = {HT}.

Thus,

P(A) = 2/4 = 1/2,    P(B) = 2/4 = 1/2,    P(A ∩ B) = 1/4,

and so

P(A ∩ B) = P(A) P(B).

Therefore, A and B are independent.

Example 20. Refer to Example 11. Are the two events E and F independent?

Solution. Since

P(E) = 0.76,    P(F) = 0.46,    and    P(E ∩ F) = 0.32,

it is clear that P(E ∩ F) ≠ P(E) P(F). Hence, E and F are dependent. Alternatively, we calculate

P(E|F) = P(E ∩ F) / P(F) = 0.32 / 0.46 = 0.696,

so P(E|F) ≠ P(E). Therefore, E and F are not independent.

HOMEWORK: pp. 155-158: 4.47, 4.49, 4.51, 4.52, 4.53, 4.61, 4.63, 4.65

4.8 Discrete Random Variables and Their Probability Distributions

Up to now our discussion of probability theory has been concerned with situations in which the sample points of a sample space are arbitrary objects that need not be numbers. In the experiment of tossing a coin, for example, the outcome is either head or tail, neither of which is a number. For many purposes it is convenient to convert the outcome of an experiment into a numerical value, in particular to be able to use the familiar structure of the real numbers. Often we are not interested in the details associated with each sample point but only in some numerical description of the outcome. For instance, the sample space with a detailed description of each possible outcome of tossing a coin twice may be written
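The independence check of Example 20 amounts to comparing P(E ∩ F) with P(E)P(F). A small Python sketch (illustrative; probabilities from the table of Example 11):

```python
# Joint and marginal probabilities from the employment table in Example 11
p_E, p_F, p_EF = 0.76, 0.46, 0.32

# Definition 12(3): independent iff P(E ∩ F) = P(E) P(F)
independent = abs(p_EF - p_E * p_F) < 1e-9
print(independent)           # False: E and F are dependent
print(round(p_EF / p_F, 3))  # P(E|F) = 0.696, which differs from P(E) = 0.76
```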

S = {HH, HT, TH, TT}.

If we are concerned only with the number of heads that appear, then a numerical value of 0, 1, or 2 will be assigned to each sample point. Thus, we are concerned only with experiments in which the outcomes either are numerical themselves or have a numerical value assigned to them. The sample space or the induced sample space (in the latter case) is then a space of numbers, and the structure of such spaces allows analyses and descriptions that may not be possible in the general case.

The number of heads in the two tosses of a coin is referred to as a random variable, which is a random quantity determined, at least in part, by some chance mechanism on the outcome of the experiment. The most frequent use of probability theory is the description of random variables. We will study how random variables are defined, how to describe their behavior, and, subsequently, how to specify their probability laws in various ways.

Suppose that we have an experiment associated with a sample space. Any random variable defined on the sample space can be thought of as a rule that associates a real number with each element in the sample space. In other words, a random variable is a function whose domain is the sample space and whose range is a set of real numbers. The formal definition may be stated as follows.

Definition 13. A random variable is a numerically valued function whose values correspond to the various outcomes of an experiment.

Definition 14. A random variable is said to be discrete if it can assume only a finite or countable number of values, and continuous if it can assume the infinitely many values corresponding to the points on a line interval.

We shall use x to denote a random variable.

Example 21. Typical discrete random variables are:
1. Number of unregistered taxicabs in a city.
2. Number of consumers who refuse to answer a telephone survey.
3. Number of defects on a randomly selected piece of furniture.
4. Number of radioactive particles that pass a register counter in a lab.
5. Number of telephone calls received by a crisis intervention hot-line during a given week.

Example 22. Typical continuous random variables are:
1. Prime interest rate.
2. Weight of a package ready to be shipped.

3. Volume of orange juice in a glass.
4. Water level of a lake.
5. The winning time for a horse running in the Kentucky Derby.

Definition 15. The probability distribution for a discrete random variable is a formula, table, or graph that gives the possible values of x and the probability p(x) associated with each value of x.

Example 23. Consider an experiment of tossing a pair of fair coins. The sample space is

S = {HH, HT, TH, TT}.

Let x be the number of heads observed. Then we see that

Outcome    x
HH         2
HT         1
TH         1
TT         0

Thus,

{x = 0} = {TT},    {x = 1} = {HT, TH},    {x = 2} = {HH}.

Hence, the probabilities corresponding to the different values of x are given by

p(0) = P{x = 0} = 1/4,    p(1) = P{x = 1} = 2/4 = 1/2,    p(2) = P{x = 2} = 1/4.

This can also be displayed in a table

x    Outcomes in x    p(x)
0    TT               1/4
1    HT, TH           1/2
2    HH               1/4

or in a histogram with bars of heights 1/4, 1/2, and 1/4 over the values x = 0, 1, 2.

Requirements for a Discrete Probability Distribution:
(1) 0 ≤ p(x) ≤ 1;
(2) Σ_{all x} p(x) = 1.

Example 24. An urn contains 3 white balls and 4 black balls. We draw two balls in succession without replacement from the urn, and let x be the number of white balls drawn. Then the sample space is

S = {WW, WB, BW, BB}.

Since x is the number of white balls drawn, we have

Outcome    x
WW         2
WB         1
BW         1
BB         0

Thus,

{x = 0} = {BB},    {x = 1} = {WB, BW},    {x = 2} = {WW}.

Hence, the probability distribution for x is given by

p(0) = P{x = 0} = P{BB} = (4/7)(3/6) = 2/7,
p(1) = P{x = 1} = P{WB} + P{BW} = (3/7)(4/6) + (4/7)(3/6) = 4/7,
p(2) = P{x = 2} = P{WW} = (3/7)(2/6) = 1/7.

This can also be displayed in a table

x    Outcomes in x    p(x)
0    BB               2/7
1    WB, BW           4/7
2    WW               1/7

or in a histogram with bars of heights 2/7, 4/7, and 1/7 over the values x = 0, 1, 2.
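The urn distribution of Example 24 can be computed with exact fractions, and requirement (2) checked directly (a sketch added for illustration):

```python
from fractions import Fraction

# Example 24: draw two balls without replacement from 3 white + 4 black
p0 = Fraction(4, 7) * Fraction(3, 6)                                     # BB
p1 = Fraction(3, 7) * Fraction(4, 6) + Fraction(4, 7) * Fraction(3, 6)   # WB or BW
p2 = Fraction(3, 7) * Fraction(2, 6)                                     # WW

print(p0, p1, p2)    # 2/7 4/7 1/7
print(p0 + p1 + p2)  # 1, as requirement (2) demands
```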

Example 25. Consider the experiment of tossing a pair of balanced dice. Let x be the sum of the two numbers that appear. The sample space consists of 36 possible outcomes (simple events):

(1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
(2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
(3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
(4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
(5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
(6,1), (6,2), (6,3), (6,4), (6,5), (6,6).

Then

{x = 2} = {(1,1)}
{x = 3} = {(1,2), (2,1)}
{x = 4} = {(1,3), (2,2), (3,1)}
{x = 5} = {(1,4), (2,3), (3,2), (4,1)}
{x = 6} = {(1,5), (2,4), (3,3), (4,2), (5,1)}
{x = 7} = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}
{x = 8} = {(2,6), (3,5), (4,4), (5,3), (6,2)}
{x = 9} = {(3,6), (4,5), (5,4), (6,3)}
{x = 10} = {(4,6), (5,5), (6,4)}
{x = 11} = {(5,6), (6,5)}
{x = 12} = {(6,6)}.

Hence, the probability distribution for x is given by

p(2) = P{x = 2} = 1/36
p(3) = P{x = 3} = 2/36 = 1/18
p(4) = P{x = 4} = 3/36 = 1/12
p(5) = P{x = 5} = 4/36 = 1/9
p(6) = P{x = 6} = 5/36
p(7) = P{x = 7} = 6/36 = 1/6
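Rather than listing the 36 outcomes by hand, the dice-sum distribution can be built by enumeration (an illustrative sketch):

```python
from collections import Counter
from itertools import product

# Example 25: count how many of the 36 equally likely outcomes give each sum
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

print(counts[7])             # 6 outcomes sum to 7, so p(7) = 6/36 = 1/6
print(sum(counts.values()))  # 36
```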

p(8) = P{x = 8} = 5/36
p(9) = P{x = 9} = 4/36 = 1/9
p(10) = P{x = 10} = 3/36 = 1/12
p(11) = P{x = 11} = 2/36 = 1/18
p(12) = P{x = 12} = 1/36.

This can also be displayed in a table

x      2     3     4     5    6     7    8     9    10    11    12
p(x)   1/36  1/18  1/12  1/9  5/36  1/6  5/36  1/9  1/12  1/18  1/36

or in a histogram with bars of the corresponding heights over the values x = 2, 3, ..., 12.

The Mean and Standard Deviation for a Discrete Random Variable

Consider a game that is played as follows. Two fair coins are tossed simultaneously. A person is paid $5 if he gets all heads and pays out $3 otherwise. The sample space is

S = {HH, HT, TH, TT}.

Let x be the net amount of money he wins in one play of this game. Clearly,

{x = 5} = {HH}    and    {x = -3} = {HT, TH, TT}.

Since the coins are fair, the probability distribution of x is immediately seen to be

x      5     -3
p(x)   1/4   3/4

Suppose the player were to play this game four million times, with a corresponding number of measurements on x observed. We wish to find the average of these four million x-values. That is, what would we expect for the mean of these four million x-values?

According to the probability distribution of x, we would expect approximately one million of the four million repetitions to result in the outcome x = 5 and three million in x = -3. Averaging the four million measurements, we obtain

Sum of measurements / 4,000,000
= [(1,000,000)(5) + (3,000,000)(-3)] / 4,000,000
= 5(1,000,000/4,000,000) + (-3)(3,000,000/4,000,000)
= 5(1/4) + (-3)(3/4)
= -$1.

This means that a person who plays this game over and over again will, on the average, lose $1 per play of the two fair coins. We should expect, on the average, to lose $1 every time we play this game. The number we have just computed, -1, is called the expected value or mean of the random variable x. It is a measure of the central tendency of the probability distribution for x. Note that an expected value is not necessarily a possible observed value of the random variable x.

This result provides some intuitive justification for the definition of the expected value of a discrete random variable x. The formal definition is given as follows.

Definition 16. If x is a discrete random variable with probability function p(x), the mean or expected value of x is given by

μ = E(x) = Σ_{all x} x p(x),

where the summation is over all values of the random variable x.

Example 26. Two balls are drawn at random without replacement from an urn containing 3 white balls and 4 black balls. As illustrated in Example 24, the probability distribution for x, the number of white balls drawn, is given by

x      0     1     2
p(x)   2/7   4/7   1/7

Thus, the expected value of x is

μ = E(x) = 0(2/7) + 1(4/7) + 2(1/7) = 6/7.

Example 27. Two balanced dice are tossed one time, and let x denote the sum of the two numbers that occur. In view of the probability distribution for x shown in Example 25, the expected value of x is

μ = E(x)
= 2(1/36) + 3(1/18) + 4(1/12) + 5(1/9) + 6(5/36) + 7(1/6)
  + 8(5/36) + 9(1/9) + 10(1/12) + 11(1/18) + 12(1/36)
= 252/36 = 7.

Example 28. A potential customer for a $60,000 fire insurance policy owns a home in an area that, according to experience, may sustain a total loss in a given year with probability 0.0015 and a 50% loss with probability 0.01. Ignoring all other partial losses, what premium should the insurance company charge for a yearly policy in order to break even on all $60,000 policies of this type?

Solution. Let x be the yearly financial gain to the insurance company resulting from the sale of the policy, and let c be the unknown yearly premium. We will calculate the value of c such that the expected gain E(x) is zero; then c is the premium required to break even. To this value the company would add administrative costs and a margin of profit. The expected gain E(x) depends upon c. Using the requirement that the expected gain must equal zero, we set

μ = E(x) = Σ_{all x} x p(x) = 0

and solve this equation for c.

First we determine the values that the gain x may take, and then the probability p(x) of each. If no loss occurs during the year, the insurance company gains the premium, x = c dollars. If a loss occurs, the gain is negative: x = c - 60,000 dollars for a house sustaining a total loss, and x = c - 30,000 dollars for a 50% loss. The probabilities associated with these three values of x are 0.9885 (= 1 - 0.0015 - 0.01), 0.0015, and 0.01, respectively. The probability distribution for the gain x may be written

x      c        c - 60,000    c - 30,000
p(x)   0.9885   0.0015        0.01

Setting the expected value of x equal to zero and solving for c, we have

E(x) = c(0.9885) + (c - 60,000)(0.0015) + (c - 30,000)(0.01) = 0,

or

0.9885c + 0.0015c - 90 + 0.01c - 300 = 0,

or

c - 390 = 0.
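Since E(x) = c minus the expected payout, the break-even premium is just the expected payout. A one-line Python check of Example 28 (added for illustration):

```python
# Example 28: break-even premium c equals the expected payout per policy
p_total_loss, p_half_loss = 0.0015, 0.01
expected_payout = 60_000 * p_total_loss + 30_000 * p_half_loss
print(expected_payout)  # approximately 390.0
```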

Hence, c = $390. Therefore, if the insurance company were to charge a yearly premium of $390 for a $60,000 fire insurance policy, the average yearly gain calculated over a large number of similar policies would be zero. Of course, the actual premium the insurance company would charge would be $390 plus administrative costs and a margin of profit.

The variance of a set of measurements is the average of the squares of the deviations of the measurements about their mean. For the variability of a random variable (or, more precisely, of its probability distribution), we take the mean or expected value of the function (x - μ)², where μ is the expected value of x.

Definition 17. The variance of a random variable x is defined to be

σ² = Var(x) = E[(x - μ)²] = Σ_{all x} (x - μ)² p(x),

where the summation is over all values of the random variable x and μ = E(x).

Definition 18. The standard deviation of a random variable x is the positive square root of its variance and is denoted by σ.

Example 29. Let x be a discrete random variable with probability distribution

x      1     2     3
p(x)   0.3   0.4   0.3

By definition, the mean, variance, and standard deviation of x are

μ = E(x) = 1(0.3) + 2(0.4) + 3(0.3) = 2,
σ² = Var(x) = E[(x - μ)²] = (1 - 2)²(0.3) + (2 - 2)²(0.4) + (3 - 2)²(0.3) = 0.6,
σ = √0.6 = 0.7746,

respectively.

Example 30. Let x be a discrete random variable with probability distribution

x      100   200   300
p(x)   0.3   0.4   0.3

By definition, the mean, variance, and standard deviation of x are

μ = E(x) = 100(0.3) + 200(0.4) + 300(0.3) = 200,
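Definitions 16 through 18 translate directly into a small helper function; checking it on Example 29 (an illustrative sketch, not part of the original notes):

```python
import math

def mean_var_sd(dist):
    # dist: list of (value, probability) pairs for a discrete random variable
    mu = sum(x * p for x, p in dist)                 # Definition 16
    var = sum((x - mu) ** 2 * p for x, p in dist)    # Definition 17
    return mu, var, math.sqrt(var)                   # Definition 18

# Example 29
mu, var, sd = mean_var_sd([(1, 0.3), (2, 0.4), (3, 0.3)])
print(mu, var, round(sd, 4))
```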

σ² = Var(x) = E[(x - μ)²] = (100 - 200)²(0.3) + (200 - 200)²(0.4) + (300 - 200)²(0.3) = 6000,
σ = √6000 = 77.46,

respectively.

Note that the more widely spread the values of the random variable, the bigger its variance (and standard deviation). In other words, a large variance means a large variability of the random variable, while a small variance indicates a small variability.

Example 31. Toss a single balanced die and let x be the number that appears on the upper face. Then the mean, variance, and standard deviation of x are

μ = E(x) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 21/6 = 3.5,
σ² = Var(x) = E[(x - μ)²]
   = (1 - 3.5)²(1/6) + (2 - 3.5)²(1/6) + (3 - 3.5)²(1/6) + (4 - 3.5)²(1/6) + (5 - 3.5)²(1/6) + (6 - 3.5)²(1/6)
   = 17.5/6 = 2.916667,
σ = √2.916667 = 1.7078251,

respectively.

HOMEWORK: pp. 170-173: 4.81, 4.83, 4.85, 4.87, 4.89, 4.91, 4.93, 4.95, 4.97