Counting principles, including permutations and combinations.


Counting principles, including permutations and combinations. The binomial theorem: expansion of (a + b)^n, n ∈ ℕ.

THE PRODUCT RULE
If there are m different ways of performing an operation and for each of these there are n different ways of performing a second, independent operation, then there are mn different ways of performing the two operations in succession. The product principle can be extended to three or more successive operations.

The number of different ways of performing an operation is equal to the sum of the different mutually exclusive possibilities.

COUNTING PATHS
The word "and" suggests multiplying the possibilities; the word "or" suggests adding them. If the order doesn't matter, it is a combination; if the order does matter, it is a permutation.

PERMUTATIONS (order matters)
A permutation of a group of symbols is any arrangement of those symbols in a definite order.

Permutations of n different objects: n!
Explanation: Assume you have n different symbols and therefore n places to fill in your arrangement. For the first place there are n possibilities; for the second place, n − 1 possible symbols; and so on until all the places are filled. By the product principle there are n(n − 1)(n − 2) ⋯ 1 = n! different arrangements.

Wise advice: If a group of items has to be kept together, treat the group as one object. Remember that there may be permutations of the items within this group too.

Permutations of k different objects chosen from n different objects available (no repetition allowed):

nPk = n!/(n − k)! = n(n − 1) ⋯ (n − k + 1)

Good logic to apply to similar questions: Suppose we have 10 letters and want to make arrangements of 4 letters. There are 10 possibilities for the first letter, 9 for the second, 8 for the third, and 7 for the last letter, so the total number of four-letter permutations is 10 × 9 × 8 × 7 = 5040.
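The counting formulas above can be sanity-checked with Python's standard library (`math.factorial` and `math.perm`); this is a small sketch added for illustration, not part of the original notes:

```python
import math

# Permutations of n different objects: n!
assert math.factorial(4) == 24  # 4 objects can be arranged in 24 ways

def permutations(n, k):
    """Ordered arrangements of k objects chosen from n: n!/(n-k)!"""
    return math.factorial(n) // math.factorial(n - k)

# The 10-letter example from the notes: 10 * 9 * 8 * 7
print(permutations(10, 4))  # → 5040
assert permutations(10, 4) == math.perm(10, 4) == 5040
```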
Permutations with repetition of k objects chosen from n different objects available: n^k
(There are n possibilities for the first choice, THEN n possibilities for the second choice, and so on, multiplying each time.)

COMBINATIONS (order doesn't matter)
A combination counts the number of ways of choosing k objects out of n available, given that the order of the elements does not matter and the elements are not repeated [such as lottery numbers (2, 14, 15, 27, 30, 33)].

The easiest way to explain it is to assume that the order does matter (i.e. permutations), then adjust so that the order does not matter. Since a combination does not take the order into account, we have to divide the number of permutations of the symbols available by the number of redundant possibilities: the k selected objects have k! redundant orderings (since order doesn't matter).

However, we also need to divide n! by the permutation of the objects that are not selected, that is to say (n − k)!. Hence:

nCk = C(n, k) = n!/(k!(n − k)!)

Binomial Expansion/Theorem

(a + b)^n = Σ_{k=0}^{n} C(n, k) a^(n−k) b^k = a^n + C(n, 1) a^(n−1) b + ⋯ + C(n, k) a^(n−k) b^k + ⋯ + b^n

Binomial Coefficient
C(n, k) is the coefficient of the term containing a^(n−k) b^k in the expansion of (a + b)^n, n ∈ ℕ.

C(n, 0) = 1, 0! = 1
C(n, k) = n(n − 1)(n − 2) ⋯ (n − k + 1)/k! = n!/(k!(n − k)!) = C(n, n − k)

The general term, or (k + 1)th term, is: T_(k+1) = C(n, k) a^(n−k) b^k
The constant term is the term containing no variables. When finding the coefficient of x^n, always consider the set of all terms containing x^n.

Probability
The number of trials is the total number of times the experiment is repeated. The outcomes are the different results possible for one trial of the experiment. Equally likely outcomes are expected to have equal frequencies. The sample space, U, is the set of all possible outcomes of an experiment. An event is the occurrence of one particular outcome.

P(A) = n(A)/n(U)

where P(A) is the probability of event A occurring in one trial, n(A) is the number of times event A occurs in the sample space, and n(U) is the total number of possible outcomes.

Complementary Events
Two events are described as complementary if they are the only two possible outcomes. Two complementary events are mutually exclusive. Since an event must either occur or not occur, the probability of the event either occurring or not occurring must be 1:

P(A) + P(A′) = 1

Use this when you need the probability that an event will not happen. Compound events arise when we are interested in more than one outcome (events described with "and", "or", "at least").
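The division argument above, and the binomial theorem itself, can be verified numerically; `math.comb` implements C(n, k) directly. A minimal sketch with made-up values of a, b and n:

```python
import math

# C(n, k) = nPk / k! = n! / (k! (n-k)!)
n, k = 10, 4
assert math.comb(n, k) == math.perm(n, k) // math.factorial(k) == 210

# Binomial theorem: (a + b)^n equals the sum of C(n, k) a^(n-k) b^k
a, b, n = 3, 5, 7
expansion = sum(math.comb(n, k) * a**(n - k) * b**k for k in range(n + 1))
assert expansion == (a + b)**n  # both sides equal 8^7
```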

Combined Events
Union: either. Intersection: both/and.

Given two events A and B, the probability of at least one of the two events occurring is

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)    (either A or B or both)

P(A) includes part of B from the intersection, and P(B) includes part of A from the intersection, so P(A ∩ B) (both A and B) was counted twice and one copy has to be subtracted. It is important to know how to get P(A ∩ B).

For mutually exclusive events (no possibility of A and B occurring at the same time), P(A ∩ B) = 0, so

P(A ∪ B) = P(A) + P(B)

Examples: turning left and turning right (you can't do both at the same time); tossing a coin once: heads and tails.

For non-mutually exclusive events we will need conditional probability.

Independent and Dependent Events
A bag contains three different kinds of marbles: red, blue and green. You pick a marble twice. The probability of picking the red one (or any colour) the second time depends on whether you put the first marble back or not.

Independent events: the probability that one event occurs in no way affects the probability of the other event occurring (you put the first marble back).
Dependent events: the probability of one event occurring influences the likelihood of the other event (you don't put the first marble back).

Conditional Probability
Given two events A and B, the conditional probability of an event A is the probability that A will occur given the knowledge that B has already occurred. This probability is written P(A | B) (notation for the probability of A given B).

The probability of the intersection of A and B (both events occur) is:

P(A ∩ B) = P(B) P(A | B)

Independent events:
P(A | B) = P(A) = P(A | B′)   (A depends on neither B nor B′)
P(A ∩ B) = P(A) P(B)

Dependent events:
P(A | B) is calculated depending on the event B:
P(A ∩ B) = P(B) P(A | B), so P(A | B) = P(A ∩ B)/P(B)
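The union and conditional-probability rules can be exercised by counting outcomes directly; here is a small sketch using one roll of a fair die (the events A and B are made up for illustration):

```python
from fractions import Fraction

U = {1, 2, 3, 4, 5, 6}   # sample space: one roll of a fair die
A = {2, 4, 6}            # event "even"
B = {3, 4, 5, 6}         # event "greater than 2"

def P(event):
    """P(A) = n(A)/n(U) for equally likely outcomes."""
    return Fraction(len(event), len(U))

# Union rule: P(A or B) = P(A) + P(B) - P(A and B)
# (note: `A | B` below is Python set union, `A & B` is set intersection)
assert P(A | B) == P(A) + P(B) - P(A & B)  # 5/6 = 3/6 + 4/6 - 2/6

# Conditional probability: P(A given B) = P(A and B) / P(B)
p_A_given_B = P(A & B) / P(B)
assert p_A_given_B == Fraction(1, 2)
```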

Use of Venn diagrams, tree diagrams and tables of outcomes to solve problems.

1. Venn Diagrams
The probability is found using the principle P(A) = n(A)/n(U).

2. Tree diagrams
A more flexible method for finding probabilities is known as a tree diagram. Through the product principle, it allows one to calculate the probabilities of the occurrence of events even where trials are non-identical (where the probability of A on a later trial differs from P(A)).

Bayes' Theorem
From P(A ∩ B) = P(B) P(A | B) and P(A ∩ B) = P(A) P(B | A):

P(A | B) = P(A ∩ B)/P(B) = P(A) P(B | A)/P(B)    (Bayes' theorem)

Another form of Bayes' theorem (formula booklet): from a tree diagram, there are two ways to get A, either after B has happened or after B has not happened:

P(A) = P(B) P(A | B) + P(B′) P(A | B′)

P(B | A) = P(B) P(A | B) / [P(B) P(A | B) + P(B′) P(A | B′)]
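The tree-diagram form of Bayes' theorem can be sketched with hypothetical figures (a test that is positive 99% of the time on a rare condition; all numbers below are illustrative, not from the notes):

```python
# Hypothetical figures: P(B) = 0.01 (has condition), P(A|B) = 0.99 (positive
# test given condition), P(A|B') = 0.05 (false-positive rate)
p_B = 0.01
p_A_given_B = 0.99
p_A_given_notB = 0.05

# Total probability (two branches of the tree):
# P(A) = P(B)P(A|B) + P(B')P(A|B')
p_A = p_B * p_A_given_B + (1 - p_B) * p_A_given_notB

# Bayes: P(B|A) = P(B)P(A|B) / P(A)
p_B_given_A = p_B * p_A_given_B / p_A
print(round(p_B_given_A, 4))  # → 0.1667
```

Despite the accurate test, P(B | A) is only about 1/6, because the condition is rare; this is exactly the effect the two-branch denominator captures.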

Extension of Bayes' Theorem
If there are more options than simply "B occurs" or "B doesn't occur", for example if there were three possible outcomes B1, B2 and B3 for the first event, then the probability of A occurring is:

P(A) = P(B1) P(A | B1) + P(B2) P(A | B2) + P(B3) P(A | B3)

P(Bi | A) = P(Bi) P(A | Bi) / [P(B1) P(A | B1) + P(B2) P(A | B2) + P(B3) P(A | B3)]

The outcomes B1, B2 and B3 must cover all the possible outcomes.

Descriptive Statistics

Concepts of population, sample, random sample and frequency distribution of discrete and continuous data.
A population is the set of all individuals with a given value for an associated variable. A sample is a small group of individuals selected (randomly, in the case of a random sample) from the population, used as a representation of the population as a whole. The frequency distribution of data is the number of individuals within a sample or population for each value of the associated variable (discrete data), or for each range of values of the associated variable (continuous data).

New guidelines in IB Math: population = sample.

Presentation of data: frequency tables and diagrams.
Grouped data: mid-interval values, interval width, upper and lower interval boundaries, frequency histograms. Mid-interval values are found by averaging the upper and lower interval boundaries. The interval width is simply the distance between the upper and lower interval boundaries. Frequency histograms are drawn with bar width proportional to interval width and frequency as the height.

Median, mode; quartiles, percentiles. Range; interquartile range; variance, standard deviation.
Mode (discrete data) is the most frequently occurring value in the data set. Modal class (continuous data) is the most frequently occurring class. Median is the middle value of an ordered data set. For an odd number of data, the median is the middle datum. For an even number of data, the median is the average of the two middle data.
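These definitions map directly onto Python's `statistics` module; a minimal check of the median and mode rules with made-up data sets:

```python
import statistics

data = [2, 3, 3, 5, 7, 8, 9]          # illustrative data, odd count

assert statistics.median(data) == 5   # middle value of the ordered data
assert statistics.mode(data) == 3     # most frequently occurring value
assert max(data) - min(data) == 7     # range = highest - lowest

data_even = [2, 3, 3, 5, 7, 8]        # even count
assert statistics.median(data_even) == 4.0  # average of the two middle data
```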

A percentile is the score below which a certain percentage of the data lies. The lower quartile (Q1) is the 25th percentile, the median (Q2) is the 50th percentile, and the upper quartile (Q3) is the 75th percentile. Range is the difference between the highest and lowest values in the data set. The interquartile range is IQR = Q3 − Q1. Cumulative frequency is the frequency of all values less than a given value.

The population mean, μ, is generally unknown, and the sample mean, x̄, used to serve as an unbiased estimate of it. That used to be the case: from now on, for examination purposes, data will be treated as the population, and estimation of the mean and variance of a population from a sample is no longer required.

Discrete and Continuous Random Variables
A variable X whose value depends on the outcome of a random process is called a random variable. For any random variable there is a probability distribution/function associated with it.

Discrete Random Variables
P(X = x), the probability distribution of X, involves listing P(x_i) for each x_i.

1. 0 ≤ P(X = x) ≤ 1
2. Σ_x P(X = x) = 1
3. P(X = x_n) = 1 − Σ_{k ≠ n} P(X = x_k)   [P(event x_n occurs) = 1 − P(any other event occurs)]

The median is given by the middle term; for an even number of terms, the average of the two middle terms (the same applies for Q1, Q3 and the IQR). It is the value of X such that P(X ≤ x) ≥ 1/2 and P(X ≥ x) ≥ 1/2.

The mode is the value of x with the largest P(X = x), which can be different from the expected value.

ALWAYS watch out for conditional probability (i.e. P(x) "given that"); it is often implied and not stated.

E(X) = μ = Σ x P(X = x) = Σ p_i x_i = (Σ f_i x_i)/(Σ f_i)

If a is constant, then E(aX) = a E(X).
If a and b are constants, then E(aX + b) = a E(X) + b.
E(X + Y) = E(X) + E(Y)
E[g(X)] = Σ g(x) P(X = x)

Var(X) = σ² = E[(X − μ)²] = (Σ f_i (x_i − μ)²)/(Σ f_i)
σ² = E(X²) − μ²
Var(aX + b) = a² Var(X)
Var(X + Y) = Var(X) + Var(Y)   (true only for independent X and Y)
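The expectation and variance rules above can be checked exactly with rational arithmetic; a sketch using a hypothetical fair four-sided die:

```python
from fractions import Fraction

# Hypothetical discrete distribution: score on a fair four-sided die
dist = {x: Fraction(1, 4) for x in [1, 2, 3, 4]}
assert sum(dist.values()) == 1                   # probabilities sum to 1

E = sum(x * p for x, p in dist.items())          # E(X) = sum of x P(X=x)
E_X2 = sum(x**2 * p for x, p in dist.items())    # E(X^2)
Var = E_X2 - E**2                                # sigma^2 = E(X^2) - mu^2
assert E == Fraction(5, 2) and Var == Fraction(5, 4)

# Linearity, with a = 2 and b = 3:
E_lin = sum((2*x + 3) * p for x, p in dist.items())
assert E_lin == 2*E + 3                          # E(aX+b) = aE(X)+b
Var_lin = sum((2*x + 3)**2 * p for x, p in dist.items()) - E_lin**2
assert Var_lin == 4 * Var                        # Var(aX+b) = a^2 Var(X)
```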

Continuous Random Variables
X defined on a ≤ x ≤ b.

The probability density function (p.d.f.), f(x), describes the relative likelihood for the variable to take on a given value. The cumulative distribution function (c.d.f.), F(t), is found by integrating the p.d.f. from the minimum value of X up to t:

F(t) = P(X ≤ t) = ∫_a^t f(x) dx

For a function f(x) to be a probability density function, it must satisfy the following conditions:

1. f(x) ≥ 0 for all x ∈ [a, b]
2. ∫_a^b f(x) dx = 1
3. for any a ≤ c < d ≤ b, P(c < X < d) = ∫_c^d f(x) dx

For a continuous random variable, the probability of any single value is zero, P(X = c) = 0, so
P(c ≤ X ≤ d) = P(c < X < d) = P(c ≤ X < d), etc.

Median: the number m such that ∫_a^m f(x) dx = 1/2.
Mode: the maximum of f(x) on a < x < b (which may not be unique).

ALWAYS watch out for conditional probability (i.e. P(x) "given that"); it is often implied and not stated.

E(X) = μ = ∫ x f(x) dx

If a is constant, then E(aX) = a E(X).
If a and b are constants, then E(aX + b) = a E(X) + b.
E(X + Y) = E(X) + E(Y)
E[g(X)] = ∫ g(x) f(x) dx

Var(X) = σ² = ∫_a^b x² f(x) dx − [∫_a^b x f(x) dx]²
σ² = E(X²) − μ²
Var(aX + b) = a² Var(X)
Var(X + Y) = Var(X) + Var(Y)   (true only for independent X and Y)

Standard deviation of X: σ = √Var(X)
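The p.d.f. conditions and the integral formulas for μ and σ² can be verified numerically; a sketch using the hypothetical density f(x) = 3x² on [0, 1] and simple trapezoidal integration:

```python
# Hypothetical p.d.f.: f(x) = 3x^2 on [0, 1], which integrates to 1
def f(x):
    return 3 * x**2

def integrate(g, a, b, n=100_000):
    """Trapezoidal approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return h * (g(a)/2 + g(b)/2 + sum(g(a + i*h) for i in range(1, n)))

total = integrate(f, 0, 1)                            # condition 2: area = 1
mu = integrate(lambda x: x * f(x), 0, 1)              # E(X) = integral x f(x)
var = integrate(lambda x: x**2 * f(x), 0, 1) - mu**2  # E(X^2) - mu^2

assert abs(total - 1) < 1e-6
assert abs(mu - 0.75) < 1e-6      # exact value 3/4
assert abs(var - 0.0375) < 1e-6   # exact value 3/80
```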

CALCULATOR

Binomial Distribution X ~ B(n, p)
n is the number of trials; each trial is either a success or a failure. p is the probability of a success; (1 − p) is the probability of a failure.

P(X = x) = C(n, x) p^x (1 − p)^(n−x),   x = 0, 1, …, n
E(X) = μ = np
Var(X) = σ² = np(1 − p)

In a given problem you write:
X ~ B(100, 0.5)
P(X ≤ 52) = 0.6913502844

BinomPDF(trials, probability of event, value)
  o Gives the probability of a particular number of successes in n trials.
    exactly: P(X = x) = binompdf(n, p, x)
BinomCDF(trials, probability of event, value)
  o Gives the cumulative probability, i.e. that the number of successes within n trials is at most the value.
    at most: P(X ≤ x) = binomcdf(n, p, x)
    at least: P(X ≥ x) = 1 − binomcdf(n, p, x − 1)

Poisson Distribution X ~ Po(m)
The average/mean number of occurrences (m) is constant for every interval. The probability of more than one occurrence in a given interval is very small. The numbers of occurrences in disjoint intervals are independent of each other.

P(X = x) = m^x e^(−m)/x!,   x = 0, 1, 2, …
E(X) = m
Var(X) = m

In a given problem you write:
X ~ Po(0.325)
P(X ≥ 6) = 1 − P(X ≤ 5) = 0.3840393444

If X ~ Po(l) and Y ~ Po(m) are independent, then X + Y ~ Po(l + m).

PoissonPDF(mean, value)
  o Gives the probability of a particular number of occurrences within a time period.
    exactly: P(X = x) = poissonpdf(m, x)
PoissonCDF(mean, value)
  o Gives the cumulative probability, i.e. the probability of at most (value) occurrences within a time period.
    at most: P(X ≤ x) = poissoncdf(m, x)
    at least: P(X ≥ x) = 1 − poissoncdf(m, x − 1)
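When no graphing calculator is at hand, the same p.m.f. and c.d.f. values can be reproduced from the formulas above with a few lines of Python (the helper names `binom_pmf`, `binom_cdf` and `poisson_pmf` below are my own, mirroring the calculator functions):

```python
import math

def binom_pmf(n, p, x):
    """P(X = x) for X ~ B(n, p), like the calculator's binompdf."""
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(n, p, x):
    """P(X <= x), like the calculator's binomcdf."""
    return sum(binom_pmf(n, p, k) for k in range(x + 1))

# The example from the notes: X ~ B(100, 0.5), P(X <= 52) ≈ 0.6914
assert abs(binom_cdf(100, 0.5, 52) - 0.69135) < 1e-4

def poisson_pmf(m, x):
    """P(X = x) for X ~ Po(m), like the calculator's poissonpdf."""
    return m**x * math.exp(-m) / math.factorial(x)

# A Poisson p.m.f. sums to 1 (to numerical precision over a long tail)
assert abs(sum(poisson_pmf(2.0, k) for k in range(100)) - 1) < 1e-12
```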

Normal distribution X ~ N(μ, σ²)

f(x) = (1/(σ√(2π))) e^(−(1/2)((x − μ)/σ)²),   −∞ < x < ∞

∫_{−∞}^{∞} f(x) dx = 1

P(X ≤ x₁) = area = ∫_{−∞}^{x₁} f(x) dx
P(X ≥ x₂) = area = ∫_{x₂}^{∞} f(x) dx
P(x₁ ≤ X ≤ x₂) = area = ∫_{x₁}^{x₂} f(x) dx

P(X = x₁) = 0, so P(x₁ ≤ X ≤ x₂) = P(x₁ < X < x₂) = P(x₁ ≤ X < x₂), etc.

In a given problem you write:
μ = 70, SD = 4.5, so X ~ N(70, 20.25)
P(65 ≤ X ≤ 80) = 0.8536055925, so the probability is 85.4%.

In a given problem you write:
μ = 2870, σ = 900
P(X ≤ a) = 0.3409 gives a = 2500

NormalCDF(lower, upper, mean, SD)
  o Gives the probability that a value is within a given range: the percentage of area under the distribution curve from the lower bound to the upper bound.
    P(x₁ ≤ X ≤ x₂) = normalcdf(x₁, x₂, μ, σ)
    P(X ≤ x) = normalcdf(−1E99, x, μ, σ)
    P(X ≥ x) = normalcdf(x, 1E99, μ, σ)

InvNorm(probability, μ, σ)
  o Given the probability (the area of the region to the left of the x value), this function returns the x value.
    P(X ≤ a) = Φ(a) = area (probability);  a = invNorm(probability, μ, σ)

Standardized normal distribution
Z = (X − μ)/σ,  Z ~ N(0, 1)  (z-score);  X = μ + Zσ
The z-score tells us where the value of X lies, in fractions of σ.

NormalCDF(lower, upper)
  o Gives the probability that a value is within a given range.
    P(z₁ ≤ Z ≤ z₂) = normalcdf(z₁, z₂)
    P(Z ≤ z) = normalcdf(−1E99, z)
    P(Z ≥ z) = normalcdf(z, 1E99)

InvNorm(probability)
  o Given a probability, gives the corresponding z-score.
    P(Z ≤ b) = Φ(b);  b = invNorm(Φ(b))
    P(Z ≤ b) = P(Z < b) because P(Z = b) = 0
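The calculator's normalcdf can be reproduced with the standard error function from `math` (the helper name `normalcdf` below is mine, chosen to mirror the calculator; the standard normal c.d.f. is Φ(z) = (1 + erf(z/√2))/2):

```python
import math

def normalcdf(lower, upper, mu=0.0, sigma=1.0):
    """Area under N(mu, sigma^2) between lower and upper."""
    def Phi(z):  # standard normal c.d.f. via the error function
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return Phi((upper - mu) / sigma) - Phi((lower - mu) / sigma)

# The worked example from the notes: mu = 70, sigma = 4.5
p = normalcdf(65, 80, 70, 4.5)
assert abs(p - 0.8536) < 1e-3   # notes give 0.8536055925

# Standardizing gives the same answer: z = (x - mu) / sigma
z1, z2 = (65 - 70) / 4.5, (80 - 70) / 4.5
assert abs(normalcdf(z1, z2) - p) < 1e-12
```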
