
Copyright & License. Copyright (c) 2006 Jason Underdown. Some rights reserved.

Card topics: choose notation; binomial theorem; n distinct items divided into r distinct groups; axioms of probability; probability of the complement; probability of the union of two events; conditional probability; the multiplication rule; Bayes' formula.

"n choose k" is a brief way of asking: how many ways can you choose k objects from a set of n objects, when the order of selection is not relevant?
$$\binom{n}{k} = \frac{n!}{(n-k)!\,k!}$$
Obviously, this implies $0 \le k \le n$.

These flashcards and the accompanying LaTeX source code are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License. For more information, see creativecommons.org. You can contact the author at: jasonu [remove-this] at physics dot utah dot edu

Suppose you want to divide n distinct items into r distinct groups of sizes $n_1, n_2, \ldots, n_r$. How do you count the possible outcomes? If $n_1 + n_2 + \cdots + n_r = n$, then the number of possible divisions is given by
$$\binom{n}{n_1, n_2, \ldots, n_r} = \frac{n!}{n_1!\, n_2! \cdots n_r!}$$

Binomial theorem:
$$(x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}$$

Axioms of probability:
1. $0 \le P(E) \le 1$
2. $P(S) = 1$
3. For any sequence of mutually exclusive events $E_1, E_2, \ldots$ (i.e. events where $E_i E_j = \emptyset$ when $i \ne j$),
$$P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i)$$

Probability of the complement: if $E^c$ denotes the complement of event $E$, then $P(E^c) = 1 - P(E)$.

Conditional probability: if $P(F) > 0$, then
$$P(E \mid F) = \frac{P(EF)}{P(F)}$$

Probability of the union of two events:
$$P(A \cup B) = P(A) + P(B) - P(AB)$$

Law of total probability:
$$P(E) = P(EF) + P(EF^c) = P(E \mid F)P(F) + P(E \mid F^c)P(F^c) = P(E \mid F)P(F) + P(E \mid F^c)\,[1 - P(F)]$$

The multiplication rule:
$$P(E_1 E_2 E_3 \cdots E_n) = P(E_1)\,P(E_2 \mid E_1)\,P(E_3 \mid E_1 E_2) \cdots P(E_n \mid E_1 \cdots E_{n-1})$$
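The counting formulas above can be sketched in Python. This is a minimal illustration: `choose` and `multinomial` are hypothetical helper names (the standard library's `math.comb` computes the same binomial coefficient directly), and the final check verifies the binomial theorem for one concrete case.

```python
from math import factorial

# Binomial coefficient: ways to choose k of n objects, order irrelevant.
def choose(n, k):
    return factorial(n) // (factorial(n - k) * factorial(k))

# Multinomial coefficient: divide n distinct items into r distinct
# groups of sizes n_1, ..., n_r (must sum to n).
def multinomial(*sizes):
    n = sum(sizes)
    result = factorial(n)
    for s in sizes:
        result //= factorial(s)
    return result

# Binomial theorem check: (x + y)^n == sum_k C(n,k) x^k y^(n-k).
x, y, n = 3, 5, 7
lhs = (x + y) ** n
rhs = sum(choose(n, k) * x**k * y ** (n - k) for k in range(n + 1))

print(choose(5, 2))          # 10
print(multinomial(2, 3, 4))  # 9! / (2! 3! 4!) = 1260
print(lhs == rhs)            # True
```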

Card topics: independent events; discrete random variable; cumulative distribution function F; properties of the cumulative distribution function; expected value (discrete case); expected value of a function of X (discrete case); linearity of expectation; variance; Bernoulli random variable; binomial random variable.

Two events E and F are said to be independent iff
$$P(EF) = P(E)\,P(F)$$
Otherwise they are said to be dependent.

For a discrete random variable X, we define the probability mass function p(a) of X by
$$p(a) = P\{X = a\}$$
Mass functions are often written as a table.

The cumulative distribution function F is defined to be
$$F(a) = \sum_{\text{all } x \le a} p(x)$$
The cumulative distribution function F(a) denotes the probability that the random variable X has a value less than or equal to a.

The cumulative distribution function satisfies the following properties:
1. F is a nondecreasing function
2. $\lim_{a \to \infty} F(a) = 1$
3. $\lim_{a \to -\infty} F(a) = 0$

Expected value (discrete case):
$$E[X] = \sum_{x : p(x) > 0} x\,p(x)$$

If X is a discrete random variable that takes on the values denoted by $x_i$ ($i = 1, \ldots, n$) with respective probabilities $p(x_i)$, then for any real-valued function f,
$$E[f(X)] = \sum_{i=1}^{n} f(x_i)\,p(x_i)$$

Linearity of expectation: if α and β are constants, then
$$E[\alpha X + \beta] = \alpha E[X] + \beta$$

If X is a random variable with mean µ, then we define the variance of X to be
$$\operatorname{var}(X) = E[(X - \mu)^2] = E[X^2] - (E[X])^2 = E[X^2] - \mu^2$$
The first expression is the actual definition, but the second and third are often more useful and can be shown to be equivalent by some algebraic manipulation.

If an experiment can be classified as either success or failure, and if we denote success by X = 1 and failure by X = 0, then X is a Bernoulli random variable with probability mass function
$$p(0) = P\{X = 0\} = 1 - p, \qquad p(1) = P\{X = 1\} = p$$
where p is the probability of success and $0 \le p \le 1$.

Suppose n independent Bernoulli trials are performed, and let X count the number of successes. If the probability of success is p and the probability of failure is 1 − p, then X is said to be a binomial random variable with parameters (n, p). The probability mass function is given by
$$p(i) = \binom{n}{i} p^i (1-p)^{n-i}, \qquad i = 0, 1, \ldots, n$$
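The discrete-case definitions above can be checked numerically. The sketch below (an illustration, not part of the flashcards; `binom_pmf` is a hypothetical helper) builds the binomial pmf for one choice of (n, p), then computes E[X] and var(X) straight from the definitions and compares them with the closed forms np and np(1 − p).

```python
from math import comb

# Binomial pmf: P{X = i} = C(n, i) p^i (1-p)^(n-i).
def binom_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p) ** (n - i)

n, p = 10, 0.3
pmf = {i: binom_pmf(i, n, p) for i in range(n + 1)}

# E[X] = sum over x of x p(x); var(X) = E[X^2] - (E[X])^2.
mean = sum(i * pmf[i] for i in pmf)
var = sum(i * i * pmf[i] for i in pmf) - mean**2

print(round(sum(pmf.values()), 9))  # 1.0  (a valid pmf sums to 1)
print(round(mean, 9))               # 3.0  = n p
print(round(var, 9))                # 2.1  = n p (1 - p)
```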

Card topics: properties of binomial random variables; Poisson random variable; properties of Poisson random variables; geometric random variable; properties of geometric random variables; negative binomial random variable; properties of negative binomial random variables; probability density function of a continuous random variable; probability density function of a uniform random variable; properties of uniform random variables.

If X is a binomial random variable with parameters n and p, then
$$E[X] = np, \qquad \operatorname{var}(X) = np(1-p)$$

A random variable X that takes on one of the values 0, 1, 2, ... is said to be a Poisson random variable with parameter λ if for some λ > 0,
$$p(i) = P\{X = i\} = \frac{\lambda^i}{i!} e^{-\lambda}, \qquad i = 0, 1, 2, \ldots$$

If X is a Poisson random variable with parameter λ, then
$$E[X] = \lambda, \qquad \operatorname{var}(X) = \lambda$$

Suppose independent Bernoulli trials are repeated until a success occurs. If we let X equal the number of trials required to achieve success, then X is a geometric random variable with probability mass function
$$p(n) = P\{X = n\} = (1-p)^{n-1} p, \qquad n = 1, 2, \ldots$$

If X is a geometric random variable with parameter p, then
$$E[X] = \frac{1}{p}, \qquad \operatorname{var}(X) = \frac{1-p}{p^2}$$

Suppose that independent Bernoulli trials (with probability of success p) are performed until r successes occur. If we let X equal the number of trials required, then X is a negative binomial random variable with probability mass function
$$p(n) = P\{X = n\} = \binom{n-1}{r-1} p^r (1-p)^{n-r}, \qquad n = r, r+1, \ldots$$

If X is a negative binomial random variable with parameters (p, r), then
$$E[X] = \frac{r}{p}, \qquad \operatorname{var}(X) = \frac{r(1-p)}{p^2}$$

We define X to be a continuous random variable if there exists a function f such that for any set B of real numbers,
$$P\{X \in B\} = \int_B f(x)\,dx$$
The function f is called the probability density function of the random variable X.

If X is a uniform random variable on the interval (α, β), then its probability density function is given by
$$f(x) = \begin{cases} \dfrac{1}{\beta - \alpha} & \text{if } \alpha < x < \beta \\ 0 & \text{otherwise} \end{cases}$$

If X is a uniform random variable with parameters (α, β), then
$$E[X] = \frac{\alpha + \beta}{2}, \qquad \operatorname{var}(X) = \frac{(\beta - \alpha)^2}{12}$$
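The expectation formulas for these named distributions can be verified directly from the pmf definitions. The sketch below (an illustration under assumed parameter values; the `*_pmf` helper names are hypothetical) sums a long but truncated series $\sum n\,p(n)$ for each distribution and checks it against the closed-form mean; the neglected tails are numerically negligible for these parameters.

```python
from math import comb, exp, factorial, isclose

# Poisson pmf: P{X = i} = lambda^i e^{-lambda} / i!
def poisson_pmf(i, lam):
    return lam**i * exp(-lam) / factorial(i)

# Geometric pmf: P{X = n} = (1-p)^(n-1) p, for n = 1, 2, ...
def geometric_pmf(n, p):
    return (1 - p) ** (n - 1) * p

# Negative binomial pmf: P{X = n} = C(n-1, r-1) p^r (1-p)^(n-r), n >= r.
def negbin_pmf(n, r, p):
    return comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

lam, p, r = 4.0, 0.25, 3

# Truncated expectation sums; tails beyond the cutoffs are negligible here.
e_poisson = sum(i * poisson_pmf(i, lam) for i in range(100))
e_geom = sum(n * geometric_pmf(n, p) for n in range(1, 2000))
e_negbin = sum(n * negbin_pmf(n, r, p) for n in range(r, 2000))

print(isclose(e_poisson, lam))    # True: E[X] = lambda
print(isclose(e_geom, 1 / p))     # True: E[X] = 1/p
print(isclose(e_negbin, r / p))   # True: E[X] = r/p
```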