Multinomial Allocation Model
Theorem 001. Suppose an experiment consists of t independent trials and that every trial results in exactly one of n distinct outcomes (multinomial trials). Let p_i equal the probability of outcome i on any trial and let C_i equal the number of times outcome i occurs in these t multinomial trials. Then

    E g(C_1, …, C_n) = t! [θ^t] { e^θ E g(Z_1, …, Z_n) },

where [θ^t] denotes extraction of the coefficient of θ^t from the power series in θ that follows it, and Z_1, …, Z_n are independent Poisson random variables such that Z_i has parameter p_i θ. That is,

    P(Z_i = z) = e^{−p_i θ} (p_i θ)^z / z!,   z = 0, 1, ….

Theorem 002. Let D_j equal the number of C_i's which equal j, j = 0, 1, …, in the above model with p_i = 1/n for i = 1, …, n. Then

    E g(D_0, D_1, …) = (n! t! / n^t) [λ^n θ^t] { e^{λ e^θ} E g(Z_0, Z_1, …) },

where Z_0, Z_1, … are independent and Z_j ~ Poisson(λ θ^j / j!) for j = 0, 1, ….

Theorem 003. Consider a fixed set of nonnegative integers q_1, …, q_n. If C_i ≥ q_i after t multinomial trials we will say outcome i has reached its quota (by time t). Let W_{r,Q,p} represent the waiting time (i.e., the smallest value of t) until exactly r of the n different possible outcomes of each multinomial trial have reached their quota. Then

    P(W_{r,Q,p} > t) = t! [θ^t] { e^θ P[(Z_1, …, Z_n) ∈ B_{r,Q,p}] }

and

    E[W_{r,Q,p}^{[k]}] = k ∫_0^∞ θ^{k−1} P[(Z_1, …, Z_n) ∈ B_{r,Q,p}] dθ,

where x^{[k]} = x(x+1)⋯(x+k−1) denotes the ascending factorial, and

(1) A_i is the event that Z_i < q_i;
(2) B_{r,Q,p} is the event that at least n−r+1 of the (independent) events A_1, …, A_n occur; and
(3) Z_1, …, Z_n are independent, Z_i ~ Poisson(p_i θ), and p_1 + ⋯ + p_n = 1.
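As a numerical sanity check on the k = 1 case of Theorem 003, the following sketch (assuming the reconstructed formulas above; the helper name `p_event` is illustrative) takes n = 2 equally likely outcomes with quotas q_1 = q_2 = 1 and r = 2, i.e. the waiting time until both outcomes have appeared. This is the 2-coupon collector problem, whose expected waiting time is 3.

```python
import math

# Sketch: n = 2 outcomes, p_1 = p_2 = 1/2, quotas (1, 1), r = 2.
def p_event(theta):
    # B_{r,Q,p}: at least n - r + 1 = 1 of the events {Z_i < 1} occurs,
    # with Z_1, Z_2 independent Poisson(theta / 2).
    p_zi_ge_1 = 1.0 - math.exp(-theta / 2.0)
    return 1.0 - p_zi_ge_1 ** 2

# E W = 1 * integral_0^inf theta^0 * P(B_{r,Q,p}) d(theta), trapezoid rule.
h, upper = 0.001, 60.0
n_steps = int(upper / h)
ew = h * (sum(p_event(i * h) for i in range(1, n_steps))
          + 0.5 * (p_event(0.0) + p_event(upper)))
print(round(ew, 4))  # close to the exact value 3
```

The integrand here is 2e^{−θ/2} − e^{−θ}, whose integral over (0, ∞) is 4 − 1 = 3, agreeing with the classical answer.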
This result can be generalized to what are known in the literature as sooner and later waiting time problems. Consider an experiment consisting of repeated multinomial trials, each trial resulting in exactly one of n+s distinct outcomes. We will assume that n of the outcomes have been classified as type A outcomes and that the remaining s outcomes are type B outcomes. Let q_1, …, q_n, q_{n+1}, …, q_{n+s} be a fixed set of nonnegative integers, and let C_i = C_i(t) equal the number of times outcome i occurs in the first t trials. If C_i ≥ q_i after t multinomial trials we will say outcome i has reached its quota (by time t).

Let W^S_{r_1,r_2,Q} represent the waiting time until exactly r_1 of the type A outcomes or exactly r_2 of the type B outcomes have reached their quota (i.e., a sooner waiting time), and let W^L_{r_1,r_2,Q} represent the waiting time until exactly r_1 of the type A outcomes and exactly r_2 of the type B outcomes have reached their quota (i.e., a later waiting time). Then

    P(W^S_{r_1,r_2,Q} > t) = t! [θ^t] { e^θ P[(Z_1, …, Z_{n+s}) ∈ B^S_{r_1,r_2}] },
    E[(W^S_{r_1,r_2,Q})^{[k]}] = k ∫_0^∞ θ^{k−1} P[(Z_1, …, Z_{n+s}) ∈ B^S_{r_1,r_2}] dθ,

and

    P(W^L_{r_1,r_2,Q} > t) = t! [θ^t] { e^θ P[(Z_1, …, Z_{n+s}) ∈ B^L_{r_1,r_2}] },
    E[(W^L_{r_1,r_2,Q})^{[k]}] = k ∫_0^∞ θ^{k−1} P[(Z_1, …, Z_{n+s}) ∈ B^L_{r_1,r_2}] dθ,

where

(1) A_i is the event that Z_i < q_i;
(2) B^S_{r_1,r_2} is the event that at least n−r_1+1 of the (independent) events A_1, …, A_n occur and at least s−r_2+1 of the (independent) events A_{n+1}, …, A_{n+s} occur;
(3) B^L_{r_1,r_2} is the event that at least n−r_1+1 of the events A_1, …, A_n occur or at least s−r_2+1 of the events A_{n+1}, …, A_{n+s} occur; and
(4) Z_1, …, Z_{n+s} are independent, Z_i ~ Poisson(p_i θ), and p_1 + ⋯ + p_{n+s} = 1.

These results could be further extended in an obvious way to allow for the possibility of splitting the set of possible outcomes into more than two types.
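A small exact check of the sooner identity (k = 1), a sketch assuming the reconstruction above: one type A outcome ("boy", probability 1/2) with quota 2 and one type B outcome ("girl", probability 1/2) with quota 2, stopping as soon as either quota is met. The sooner waiting time is then at most 3 trials, so it can be enumerated exactly.

```python
import math
from itertools import product
from fractions import Fraction

# Exact E[W^S] by enumerating all length-3 outcome sequences (each prob 1/8).
exact = Fraction(0)
for seq in product("bg", repeat=3):
    w = next(t for t in range(1, 4)
             if seq[:t].count("b") == 2 or seq[:t].count("g") == 2)
    exact += Fraction(w, 2 ** 3)

# Poissonized side: E[W^S] = integral_0^inf P(Z_b < 2, Z_g < 2) d(theta) with
# Z_b, Z_g independent Poisson(theta/2); integrating term by term gives
# sum over i, j < 2 of (i + j)! / (i! j!) * (1/2)^(i + j).
poissonized = sum(math.factorial(i + j) / (math.factorial(i) * math.factorial(j))
                  * 0.5 ** (i + j) for i in range(2) for j in range(2))

print(float(exact), poissonized)  # both equal 2.5
```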
Theorem 004. Suppose an experiment consists of repeated identical and independent trials where each trial results in exactly one of n+1 distinct outcomes A_0, A_1, …, A_n. Let p_j = P(A_j) for j = 0, 1, …, n. Assume the trials continue until the k-th occurrence of outcome A_0. Let C_j, j = 1, …, n, equal the number of times outcome A_j is observed before the trials stop. Then

    E g(C_1, …, C_n) = ∫_0^∞ E g(Z_1, …, Z_n) θ^{k−1} e^{−θ} / (k−1)! dθ,

where Z_1, …, Z_n are independent and Z_j ~ Poisson(p_j θ / p_0).

Note: In addition to its direct use in waiting time problems where the stopping rule depends on one distinctive outcome achieving its quota, Theorem 004 can also be used when the stopping rule depends on multiple outcomes achieving their quotas, by partitioning the problem according to which outcome occurs last. Example 004 illustrates such an application.
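A Monte Carlo sanity check of Theorem 004, a sketch assuming the reconstruction above: with g(c_1, …, c_n) = c_1 the theorem gives E C_1 = ∫_0^∞ (p_1 θ / p_0) θ^{k−1} e^{−θ}/(k−1)! dθ = k p_1 / p_0, which we compare against a direct simulation of the stopped trials (the probabilities and seed below are arbitrary choices for illustration).

```python
import random

random.seed(12345)
p0, p1, p2 = 0.5, 0.3, 0.2   # P(A_0), P(A_1), P(A_2)
k = 2                        # stop at the 2nd occurrence of A_0

def one_run():
    # Run trials until the k-th A_0; count occurrences of A_1 along the way.
    hits_a0 = c1 = 0
    while hits_a0 < k:
        u = random.random()
        if u < p0:
            hits_a0 += 1
        elif u < p0 + p1:
            c1 += 1
    return c1

n_reps = 200_000
mc_mean = sum(one_run() for _ in range(n_reps)) / n_reps
print(mc_mean, k * p1 / p0)  # simulated mean close to the exact value 1.2
```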
Theorem 005. Suppose we perform t multinomial trials with n+r distinct possible outcomes. We will assume that n of the outcomes have been classified as type A outcomes and that the remaining r outcomes are type B outcomes, that all type A outcomes occur with probability p_1, and that all type B outcomes occur with probability p_2, so that n p_1 + r p_2 = 1. Let C_j equal the number of type A outcomes which occur exactly j times in the t trials and let D_j equal the number of type B outcomes which occur exactly j times in the t trials, j = 0, 1, …, t. Then

    E g(C_0, D_0, C_1, D_1, …) = n! r! t! [λ^n μ^r θ^t] { e^{λ e^{p_1 θ} + μ e^{p_2 θ}} E g(W_0, Z_0, W_1, Z_1, …) },

where W_j ~ Poisson(λ (p_1 θ)^j / j!), Z_j ~ Poisson(μ (p_2 θ)^j / j!), and all random variables are independent.

Theorem 006. Suppose, as in Theorem 005, that we perform t multinomial trials with n+r distinct possible outcomes, n of type A each occurring with probability p_1 and r of type B each occurring with probability p_2, where n p_1 + r p_2 = 1.
Let C_j equal the number of type A outcomes which occur exactly j times in the t trials, and let Y_i, i = 1, …, r, equal the number of times the i-th type B outcome occurs in the first t trials. Then

    E g(C_0, C_1, …, Y_1, …, Y_r) = n! t! [λ^n θ^t] { e^{λ e^{p_1 θ} + r p_2 θ} E g(W_0, W_1, …, Z_1, …, Z_r) },

where W_j ~ Poisson(λ (p_1 θ)^j / j!), Z_i ~ Poisson(p_2 θ), and all random variables are independent.

Examples for Section 00

Example 001. (Joint moments of the count size distribution in multinomial trials.) Suppose an experiment consists of t independent trials and that every trial results in exactly one of n distinct outcomes (multinomial trials). Let p_i equal the probability of outcome i on any trial and let C_i equal the number of times outcome i occurs in these t multinomial trials. Then

    E[C_1^{(α_1)} ⋯ C_n^{(α_n)}] = t! / (t − α_1 − ⋯ − α_n)! · p_1^{α_1} ⋯ p_n^{α_n},

where x^{(a)} = x(x−1)⋯(x−a+1) denotes the falling factorial. This result is given in Johnson and Kotz [0].

Proof. If we take

    g(C_1, …, C_n) = C_1^{(α_1)} ⋯ C_n^{(α_n)}

in Theorem 001, then

    E g(C_1, …, C_n) = t! [θ^t] { e^θ E[Z_1^{(α_1)} ⋯ Z_n^{(α_n)}] }
                     = t! [θ^t] { e^θ E[Z_1^{(α_1)}] ⋯ E[Z_n^{(α_n)}] },

using the independence of the Z_i's, where Z_i ~ Poisson(p_i θ). We note that E[X^{(a)}] = δ^a for X ~ Poisson(δ), and the final form follows on using this result and extracting the coefficient of θ^t.
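The factorial moment formula of Example 001 can be checked exactly by brute-force enumeration; a minimal sketch, with an arbitrarily chosen small case (n = 3 outcomes, t = 5 trials):

```python
from fractions import Fraction
from math import factorial, prod

def falling(x, a):
    # Falling factorial x^(a) = x (x-1) ... (x-a+1).
    return prod(x - i for i in range(a))

p = (Fraction(1, 2), Fraction(1, 3), Fraction(1, 6))
t, alpha = 5, (2, 1, 0)

# Brute force: sum over all multinomial count vectors (c1, c2, c3).
lhs = Fraction(0)
for c1 in range(t + 1):
    for c2 in range(t + 1 - c1):
        c3 = t - c1 - c2
        pmf = (factorial(t) // (factorial(c1) * factorial(c2) * factorial(c3))
               ) * p[0] ** c1 * p[1] ** c2 * p[2] ** c3
        lhs += pmf * falling(c1, alpha[0]) * falling(c2, alpha[1]) * falling(c3, alpha[2])

# Closed form: t! / (t - sum(alpha))! * prod(p_i ^ alpha_i).
rhs = Fraction(factorial(t), factorial(t - sum(alpha))) * prod(
    pi ** ai for pi, ai in zip(p, alpha))

print(lhs == rhs, rhs)  # prints: True 5
```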
Example 002. (A fixed number of multinomial outcomes selected a fixed number of times, within both of two types.) Suppose we perform t multinomial trials with n+r distinct possible outcomes. We will assume that n of the outcomes have been classified as type A outcomes and that the remaining r outcomes are type B outcomes, that all type A outcomes occur with constant probability p_1, and that all type B outcomes occur with constant probability p_2, so that n p_1 + r p_2 = 1. Let C_j equal the number of type A outcomes which occur exactly j times in the t trials and let D_j equal the number of type B outcomes which occur exactly j times in the t trials, j = 0, 1, …, t. Then

    P(C_i = k) = binom(n, k) Σ_{j=0}^{n−k} (−1)^j binom(n−k, j) · t! / [ (i!)^{k+j} (t − i(k+j))! ] · p_1^{i(k+j)} (1 − (k+j) p_1)^{t−i(k+j)},

where terms with t − i(k+j) < 0 are zero, and

    E[C_0^{(α_0)} ⋯ C_t^{(α_t)} D_0^{(β_0)} ⋯ D_t^{(β_t)}]
        = n! r! t! / [ (n−α)! (r−β)! (t−Aα−Aβ)! ] · p_1^{Aα} p_2^{Aβ} (1 − α p_1 − β p_2)^{t−Aα−Aβ} / Π_j (j!)^{α_j+β_j},

where α = α_0 + ⋯ + α_t, β = β_0 + ⋯ + β_t, Aα = Σ_j j α_j, and Aβ = Σ_j j β_j. The first result is given in Charalambides [0]; David and Barton [0] give the first result for the special case i = 0. Charalambides [0] gives the second result for the special case α_k = v, α_i = 0 for i ≠ k, and β_k = 0 for all k. We note that x^{(0)} = 1 by definition.

Proof. For the first result let V_i equal the number of balls in urn i (i.e., the number of times outcome i occurs) and let

    g(V_1, …, V_{n+r}) = 1 if exactly k of V_1, …, V_n equal i, and 0 else.

By Theorem 001,

    P(C_i = k) = E g(V_1, …, V_{n+r}) = t! [θ^t] { e^θ P[exactly k of Z_1, …, Z_n equal i] },

where the Z_j ~ Poisson(p_1 θ). The final form follows on substituting for P(Z_j = i) and extracting the coefficient of θ^t.

For the joint factorial moment result we have, by Theorem 005, that

    E[C_0^{(α_0)} ⋯ C_t^{(α_t)} D_0^{(β_0)} ⋯ D_t^{(β_t)}]
        = n! r! t! [λ^n μ^r θ^t] { e^{λ e^{p_1 θ} + μ e^{p_2 θ}} E[W_0^{(α_0)} ⋯ W_t^{(α_t)} Z_0^{(β_0)} ⋯ Z_t^{(β_t)}] }
        = n! r! t! [λ^n μ^r θ^t] { e^{λ e^{p_1 θ} + μ e^{p_2 θ}} E[W_0^{(α_0)}] ⋯ E[W_t^{(α_t)}] E[Z_0^{(β_0)}] ⋯ E[Z_t^{(β_t)}] },
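The reconstructed distribution of C_i can be checked exactly against brute-force enumeration; a sketch, with an arbitrarily chosen small case (n = 2 type A outcomes with p_1 = 3/10 each, r = 1 type B outcome with p_2 = 4/10, t = 4 trials, i = 1):

```python
from fractions import Fraction
from math import factorial, comb

p1, p2 = Fraction(3, 10), Fraction(4, 10)
n, t, i = 2, 4, 1

def formula(k):
    # Alternating-sum form of P(C_i = k); terms with t - i(k+j) < 0 are zero.
    total = Fraction(0)
    for j in range(n - k + 1):
        c = k + j
        if t - i * c < 0:
            continue
        total += ((-1) ** j * comb(n - k, j)
                  * Fraction(factorial(t), factorial(i) ** c * factorial(t - i * c))
                  * p1 ** (i * c) * (1 - c * p1) ** (t - i * c))
    return comb(n, k) * total

# Brute force over all count vectors (v1, v2, v3) with v1 + v2 + v3 = t.
brute = {k: Fraction(0) for k in range(n + 1)}
for v1 in range(t + 1):
    for v2 in range(t + 1 - v1):
        v3 = t - v1 - v2
        pmf = (Fraction(factorial(t), factorial(v1) * factorial(v2) * factorial(v3))
               * p1 ** v1 * p1 ** v2 * p2 ** v3)
        brute[(v1 == i) + (v2 == i)] += pmf

print(all(formula(k) == brute[k] for k in range(n + 1)))  # prints: True
```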
using the independence of the random variables W_0, W_1, …, Z_0, Z_1, …, where W_j ~ Poisson(λ (p_1 θ)^j / j!) and Z_j ~ Poisson(μ (p_2 θ)^j / j!). We note that E[X^{(a)}] = δ^a for X ~ Poisson(δ), and the final form follows on using this result and extracting the coefficient of λ^n μ^r θ^t.

Example 003. (A selection of waiting time problems.) In each of the following parts of this example let W_{r,Q,p} represent the waiting time until exactly r of the n different possible outcomes of each multinomial trial have reached their quota.

(a) An urn contains m distinct solid colored balls and v distinct striped balls (so all m+v balls in the urn are distinct). You continue to randomly sample with replacement from this urn until you observe some r of the m different solid colored balls at least once each. This is a variation of the classical coupon collector's problem. In this case,

    E[W_{r,Q,p}^{[k]}] = k! (m+v)^k Σ_{j=m−r+1}^{m} (−1)^{j−m+r−1} binom(j−1, m−r) binom(m, j) / j^k.

This result is given in Charalambides [0].

(b) Newman and Shepp [0] consider the same model as in (a) except that sampling continues until one observes some r of the m different solid colored balls at least twice each. Newman and Shepp note that this situation occurs when trading is allowed between two cooperative collectors. Newman and Shepp's solution is only for the case k = 1 and is in a complicated form. An easier form to work with for general k is

    E[W_{r,Q,p}^{[k]}] = k (m+v)^k Σ_{j=m−r+1}^{m} (−1)^{j−m+r−1} binom(j−1, m−r) binom(m, j) Σ_{i=0}^{j} binom(j, i) (k+i−1)! / j^{k+i}.
(c) Suppose a multinomial experiment consists of m+v distinct outcomes, and that m of these outcomes can be classified as type P outcomes with probabilities p_1, …, p_m and quotas q_1, …, q_m. Assume that these multinomial trials continue until any one of the type P outcomes has met its quota. This is a generalization of the classical birthday problem. In this case,

    E[W_{1,Q,p}^{[k]}] = k Σ_{j_1=0}^{q_1−1} ⋯ Σ_{j_m=0}^{q_m−1} (j_1+⋯+j_m+k−1)! / (j_1! ⋯ j_m!) · p_1^{j_1} ⋯ p_m^{j_m} / (p_1+⋯+p_m)^{j_1+⋯+j_m+k}.

This result was derived by Klamkin and Newman [0] for k = 1 and extended to general k by Dwass [0]. The case m = 2, p_1 = p_2 = 1/2 and q_1 = q_2 = q is often referred to as Banach's match box problem (see Feller [0] and Holst [0]).

(d) The game of one person Knock 'm Down is described in full in Benjamin, Fluet and Huber [0], but the basic idea of the game is to choose q_1, …, q_n, for a fixed value of q_1 + ⋯ + q_n, to minimize E[W_{n,Q,p}], that is, the expected waiting time for all outcomes to meet their quotas. In this case,

    E[W_{n,Q,p}] = Σ_{S ∈ 𝒮} (−1)^{|S|+1} Σ_{j_i=0, i∈S}^{q_i−1} ( Σ_{i∈S} j_i )! / Π_{i∈S} j_i! · Π_{i∈S} p_i^{j_i} / ( Σ_{i∈S} p_i )^{(Σ_{i∈S} j_i)+1},

where q_{r_1}, …, q_{r_m} are the nonzero quotas amongst q_1, …, q_n and 𝒮 is the set of all nonempty subsets of {r_1, …, r_m}. This result allows for the calculation of E[W_{n,Q,p}] for any given quota vector Q, but does not aid in finding the Q which minimizes E[W_{n,Q,p}] for a fixed value of q_1 + ⋯ + q_n, short of calculating E[W_{n,Q,p}] for all possible Q. The paper by Benjamin, Fluet and Huber [0] gives some results for reducing the class of feasible Q. Benjamin and Fluet [0] posed but left open the question of how to choose a quota vector Q that maximizes P(Q finishes before Q′) for a given opponent quota vector Q′ in a two person version of Knock 'm Down. We point out, but do not further consider here, that an algorithm for computing P(Q finishes before Q′) can be devised by generalizing the ideas given in Thorp [0]. The nontransitive probabilities that arise in two person Knock 'm Down make the game especially interesting.

Proof. For part (a) we apply Theorem 003 with n = m+v and Q = (1, …, 1, ∞, …, ∞). In this case we have
    E[W_{r,Q,p}^{[k]}]
        = k ∫_0^∞ t^{k−1} P[at least m+v−r+1 of the m+v events {Y_1 < 1}, …, {Y_m < 1}, {Y_{m+1} < ∞}, …, {Y_{m+v} < ∞} are true] dt
        = k ∫_0^∞ t^{k−1} P[at least m−r+1 of the events {Y_1 < 1}, …, {Y_m < 1} are true] dt
        = k ∫_0^∞ t^{k−1} Σ_{j=m−r+1}^{m} (−1)^{j−m+r−1} binom(j−1, m−r) binom(m, j) [P(Y = 0)]^j dt,

where Y ~ Poisson(t/(m+v)).

For part (b) we apply Theorem 003 with n = m+v and Q = (2, …, 2, ∞, …, ∞). In this case we have

    E[W_{r,Q,p}^{[k]}]
        = k ∫_0^∞ t^{k−1} P[at least m+v−r+1 of the m+v events {Y_1 < 2}, …, {Y_m < 2}, {Y_{m+1} < ∞}, …, {Y_{m+v} < ∞} are true] dt
        = k ∫_0^∞ t^{k−1} P[at least m−r+1 of the events {Y_1 < 2}, …, {Y_m < 2} are true] dt
        = k ∫_0^∞ t^{k−1} Σ_{j=m−r+1}^{m} (−1)^{j−m+r−1} binom(j−1, m−r) binom(m, j) [P(Y ≤ 1)]^j dt,

where Y ~ Poisson(t/(m+v)). In both parts (a) and (b) the final form follows on simplification.
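Two exact checks of the part (a) closed form, a sketch assuming the reconstruction above. With r = m, v = 0 and k = 1 it must reduce to the classical coupon collector mean m·H_m; with m = r = 2, v = 0 and k = 2 the ascending factorial moment E[W(W+1)] can be computed directly (W = 1 + G with G geometric(1/2), giving 14).

```python
from fractions import Fraction
from math import comb, factorial

def ew_k(m, v, r, k):
    # E[W^{[k]}] = k! (m+v)^k * sum_{j=m-r+1}^{m} (-1)^{j-m+r-1}
    #              binom(j-1, m-r) binom(m, j) / j^k
    return factorial(k) * (m + v) ** k * sum(
        Fraction((-1) ** (j - m + r - 1) * comb(j - 1, m - r) * comb(m, j), j ** k)
        for j in range(m - r + 1, m + 1))

m = 6
harmonic = sum(Fraction(1, j) for j in range(1, m + 1))
print(ew_k(m, 0, m, 1) == m * harmonic)  # prints: True

print(ew_k(2, 0, 2, 2))  # prints: 14
```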
For part (c) we apply Theorem 003 with n = m+v and Q = (q_1, …, q_m, ∞, …, ∞). In this case we have

    E[W_{1,Q,p}^{[k]}]
        = k ∫_0^∞ t^{k−1} P[at least m+v of the m+v events {Y_1 < q_1}, …, {Y_m < q_m}, {Y_{m+1} < ∞}, …, {Y_{m+v} < ∞} are true] dt
        = k ∫_0^∞ t^{k−1} P[at least m of the m events {Y_1 < q_1}, …, {Y_m < q_m} are true] dt
        = k ∫_0^∞ t^{k−1} Π_{i=1}^{m} P(Y_i < q_i) dt,

where Y_i ~ Poisson(p_i t). The final form follows on simplification.

Part (d) is a direct application of Theorem 003.

Example 004. (Some more involved waiting time problems.) Suppose identical balls are independently distributed into n equally likely boxes (i.e., a multinomial model or classical allocation scheme) until any k of the n boxes have at least m balls each. Let M_v denote the number of boxes containing exactly v balls when the distribution of balls into boxes stops, and let Z ~ Poisson(θ). Then

    P(M_m − 1 = s) = n! / [ (m−1)! (n−k)! s! (k−s−1)! ] ∫_0^∞ [P(Z ≤ m−1)]^{n−k} [P(Z = m)]^{s} [P(Z > m)]^{k−s−1} θ^{m−1} e^{−θ} dθ

for s = 0, …, k−1, and equals 0 else.
Similarly, for the factorial moments,

    E[(M_m − 1)^{(r)}] = n! / [ (m−1)! (n−k)! (k−r−1)! ] ∫_0^∞ [P(Z ≤ m−1)]^{n−k} [P(Z = m)]^{r} [P(Z ≥ m)]^{k−r−1} θ^{m−1} e^{−θ} dθ.

Analogous results for the other values of v can be worked out but are not given here. The special case m = 1 simplifies to

    P(M_1 − 1 = s) = n! / [ (n−k)! s! (k−s−1)! ] Σ_{i=0}^{k−s−1} Σ_{j=0}^{i} (−1)^i binom(k−s−1, i) binom(i, j) (s+j)! / (n−k+s+i+1)^{s+j+1}

and

    E[(M_1 − 1)^{(r)}] = n! r! / [ (n−k)! (k−r−1)! ] Σ_{i=0}^{k−r−1} (−1)^i binom(k−r−1, i) / (n−k+r+i+1)^{r+1}.

Holst [0], [0] considers the special case m = 1 and additionally assumes k = n, but the form of the solutions given in his papers is complicated.

Proof.

    P(M_m − 1 = s) = Σ_{i=1}^{n} P(M_m − 1 = s and box i is the last box to receive a ball)
        = n P(M_m − 1 = s and box 1 is the last box to receive a ball)
        = n P(T_s and box 1 is the last box to receive a ball),

where T_s is the event that exactly k−1 of boxes 2 through n have at least m balls and exactly s of boxes 2 through n have exactly m balls.
Stated in this way we can view the problem as asking for the probability of the event T_s where the stopping rule is to stop when the first box has exactly m balls. The purpose of casting the problem in this way is to make the application of Theorem 004 appropriate. Let C_i equal the number of balls distributed into box i+1, i = 1, …, n−1, before the trials stop, and define

    g(C_1, …, C_{n−1}) = 1 if exactly k−1 of C_1, …, C_{n−1} are ≥ m and exactly s of C_1, …, C_{n−1} equal m, and 0 else.

It then follows from Theorem 004 that

    P(M_m − 1 = s) = n ∫_0^∞ P[exactly k−1 of Z_1, …, Z_{n−1} are ≥ m and exactly s of Z_1, …, Z_{n−1} equal m] θ^{m−1} e^{−θ} / (m−1)! dθ,

where Z_1, …, Z_{n−1} are iid Poisson(θ). The final form follows on recognizing that exactly k−1 of Z_1, …, Z_{n−1} are at least m and exactly s of Z_1, …, Z_{n−1} equal m if and only if exactly n−k of Z_1, …, Z_{n−1} are less than m, exactly s of Z_1, …, Z_{n−1} equal m, and exactly k−s−1 of Z_1, …, Z_{n−1} exceed m.

The factorial moment result follows from the definition. We have

    E[(M_m − 1)^{(r)}] = Σ_{s=r}^{k−1} s^{(r)} P(M_m − 1 = s)
        = n! / [ (m−1)! (n−k)! ] ∫_0^∞ [P(Z ≤ m−1)]^{n−k} Σ_{s=r}^{k−1} [P(Z = m)]^{s} [P(Z > m)]^{k−s−1} / [ (s−r)! (k−s−1)! ] θ^{m−1} e^{−θ} dθ,

and the final form follows on applying the binomial theorem to remove the sum. The results for the special case m = 1 follow immediately on simplification.

Example 005. (Sooner and later waiting time problems: how many children?) Consider parents who continue to have children until they get q_1 boys or q_2 girls, whichever comes first. Let U equal the number of children they will have.
Alternatively, consider parents who continue to have children until they get q_1 boys and q_2 girls. Let V equal the number of children they will have. Then

    E[U^{[k]}] = k Σ_{i=0}^{q_1−1} Σ_{j=0}^{q_2−1} (i+j+k−1)! / (i! j!) · p^i (1−p)^j

and

    E[V^{[k]}] = k! binom(k+q_1−1, q_1−1) / p^k + k! binom(k+q_2−1, q_2−1) / (1−p)^k − E[U^{[k]}],

where p = P(boy), 1−p = P(girl), and x^{[k]} = x(x+1)⋯(x+k−1) is the ascending factorial. Ebneshahrashoob and Sobel [0] derive expressions for both of the above expectations for k = 1 and k = 2; however, their solutions are in a complex form involving C and D incomplete Dirichlet integrals of type 2 as defined in Sobel, Uppuluri and Frankowski [0]. Ling [0] derives a recurrence relation for both of the above expectations for the case k = 1.

Proof. Both parts of this problem are applications of Theorem 003 (in its sooner/later form) with n = 1 type A outcome (boys), s = 1 type B outcome (girls), p = P(boy), and 1−p = P(girl). In the first part we are waiting for r_1 = 1 of the n = 1 type A outcomes or r_2 = 1 of the s = 1 type B outcomes. By Theorem 003,

    E[U^{[k]}] = k ∫_0^∞ t^{k−1} P[(Z_1, Z_2) ∈ B^S_{1,1}] dt = k ∫_0^∞ t^{k−1} P(Z_1 < q_1, Z_2 < q_2) dt,

where Z_1 ~ Poisson(pt) and Z_2 ~ Poisson((1−p)t). The final form follows on using the independence of Z_1 and Z_2 to solve for P(Z_1 < q_1, Z_2 < q_2) and carrying out the integration.
In the second part we are waiting for r_1 = 1 of the n = 1 type A outcomes and r_2 = 1 of the s = 1 type B outcomes. By Theorem 003,

    E[V^{[k]}] = k ∫_0^∞ t^{k−1} P[(Z_1, Z_2) ∈ B^L_{1,1}] dt = k ∫_0^∞ t^{k−1} P(Z_1 < q_1 or Z_2 < q_2) dt
        = k ∫_0^∞ t^{k−1} [ P(Z_1 < q_1) + P(Z_2 < q_2) − P(Z_1 < q_1, Z_2 < q_2) ] dt,

where Z_1 ~ Poisson(pt) and Z_2 ~ Poisson((1−p)t). The solution in the form

    E[V^{[k]}] = (k / p^k) Σ_{i=0}^{q_1−1} (k+i−1)! / i! + (k / (1−p)^k) Σ_{j=0}^{q_2−1} (k+j−1)! / j! − E[U^{[k]}]

follows directly on integrating. The final form follows on using identity (19) of Gould [0].
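An exact check of the Example 005 formulas for k = 1, a sketch assuming the reconstruction above, with the arbitrarily chosen case q_1 = 2 boys, q_2 = 1 girl and p = P(boy) = 1/2. The sooner time U is at most q_1 + q_2 − 1 = 2 children, so it can be enumerated directly, and for k = 1 the later mean must equal the sum of the two negative binomial means minus E U.

```python
from fractions import Fraction
from itertools import product
from math import comb, factorial

p, q1, q2 = Fraction(1, 2), 2, 1

def eu(k):
    # E[U^{[k]}] = k * sum_{i<q1} sum_{j<q2} (i+j+k-1)!/(i! j!) p^i (1-p)^j
    return k * sum(Fraction(factorial(i + j + k - 1), factorial(i) * factorial(j))
                   * p ** i * (1 - p) ** j
                   for i in range(q1) for j in range(q2))

def ev(k):
    # E[V^{[k]}] = k! binom(k+q1-1, q1-1)/p^k + k! binom(k+q2-1, q2-1)/(1-p)^k - E[U^{[k]}]
    return (factorial(k) * comb(k + q1 - 1, q1 - 1) / p ** k
            + factorial(k) * comb(k + q2 - 1, q2 - 1) / (1 - p) ** k
            - eu(k))

# Brute force E U over all length-2 birth sequences (each has probability 1/4).
brute_u = sum(Fraction(1, 2 ** 2)
              * next(t for t in range(1, 3)
                     if seq[:t].count("b") == q1 or seq[:t].count("g") == q2)
              for seq in product("bg", repeat=2))
print(eu(1) == brute_u == Fraction(3, 2))        # prints: True

# Later mean identity: E V = q1/p + q2/(1-p) - E U.
print(ev(1) == q1 / p + q2 / (1 - p) - brute_u)  # prints: True
```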
More informationBINOMIAL DISTRIBUTION
BINOMIAL DISTRIBUTION The binomial distribution is a particular type of discrete pmf. It describes random variables which satisfy the following conditions: 1 You perform n identical experiments (called
More informationMachine Learning
Machine Learning 10-601 Tom M. Mitchell Machine Learning Department Carnegie Mellon University August 30, 2017 Today: Decision trees Overfitting The Big Picture Coming soon Probabilistic learning MLE,
More information1. (Rao example 11.15) A study measures oxygen demand (y) (on a log scale) and five explanatory variables (see below). Data are available as
ST 51, Summer, Dr. Jason A. Osborne Homework assignment # - Solutions 1. (Rao example 11.15) A study measures oxygen demand (y) (on a log scale) and five explanatory variables (see below). Data are available
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationPreliminary statistics
1 Preliminary statistics The solution of a geophysical inverse problem can be obtained by a combination of information from observed data, the theoretical relation between data and earth parameters (models),
More informationRedoing the Foundations of Decision Theory
Redoing the Foundations of Decision Theory Joe Halpern Cornell University Joint work with Larry Blume and David Easley Economics Cornell Redoing the Foundations of Decision Theory p. 1/21 Decision Making:
More informationSTAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS
STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,
More informationEDRP lecture 7. Poisson process. Pawe J. Szab owski
EDRP lecture 7. Poisson process. Pawe J. Szab owski 2007 Counting process Random process fn t ; t 0g is called a counting process, if N t is equal total number of events that have happened up to moment
More information107 Exercises in Probability Theory
UNIVERSITY OF KENT Institute of Mathematics, Statistics and Actuarial Science Module MA304 DISCRETE MATHEMATICS AND PROBABILITY 107 Exercises in Probability Theory 1 2 1. Suppose that the sample space
More information{ 0! = 1 n! = n(n 1)!, n 1. n! =
Summations Question? What is the sum of the first 100 positive integers? Counting Question? In how many ways may the first three horses in a 10 horse race finish? Multiplication Principle: If an event
More informationProbability Distributions
02/07/07 PHY310: Statistical Data Analysis 1 PHY310: Lecture 05 Probability Distributions Road Map The Gausssian Describing Distributions Expectation Value Variance Basic Distributions Generating Random
More informationProbability, Random Processes and Inference
INSTITUTO POLITÉCNICO NACIONAL CENTRO DE INVESTIGACION EN COMPUTACION Laboratorio de Ciberseguridad Probability, Random Processes and Inference Dr. Ponciano Jorge Escamilla Ambrosio pescamilla@cic.ipn.mx
More informationIntroduction to Probability 2017/18 Supplementary Problems
Introduction to Probability 2017/18 Supplementary Problems Problem 1: Let A and B denote two events with P(A B) 0. Show that P(A) 0 and P(B) 0. A A B implies P(A) P(A B) 0, hence P(A) 0. Similarly B A
More informationAn investigation into the use of maximum posterior probability estimates for assessing the accuracy of large maps
An investigation into the use of maximum posterior probability estimates for assessing the accuracy of large maps Brian Steele 1 and Dave Patterson2 Dept. of Mathematical Sciences University of Montana
More informationSolution: There are 30 choices for the first person to leave, 29 for the second, etc. Thus this exodus can occur in. = P (30, 8) ways.
Math-2320 Assignment 7 Solutions Problem 1: (Section 7.1 Exercise 4) There are 30 people in a class learning about permutations. One after another, eight people gradually slip out the back door. In how
More informationSo far we discussed random number generators that need to have the maximum length period.
So far we discussed random number generators that need to have the maximum length period. Even the increment series has the maximum length period yet it is by no means random How do we decide if a generator
More informationProbability, For the Enthusiastic Beginner (Exercises, Version 1, September 2016) David Morin,
Chapter 8 Exercises Probability, For the Enthusiastic Beginner (Exercises, Version 1, September 2016) David Morin, morin@physics.harvard.edu 8.1 Chapter 1 Section 1.2: Permutations 1. Assigning seats *
More informationThe Hahn-Banach and Radon-Nikodym Theorems
The Hahn-Banach and Radon-Nikodym Theorems 1. Signed measures Definition 1. Given a set Hand a 5-field of sets Y, we define a set function. À Y Ä to be a signed measure if it has all the properties of
More informationCSU FRESNO MATH PROBLEM SOLVING. Part 1: Counting & Probability
CSU FRESNO MATH PROBLEM SOLVING February 8, 009 Part 1: Counting & Probability Counting: products and powers (multiply the number of choices idea) 1. (MH 11-1 008) Suppose we draw 100 horizontal lines
More informationECE302 Spring 2015 HW10 Solutions May 3,
ECE32 Spring 25 HW Solutions May 3, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where
More informationRandom Variables Example:
Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the
More information2.6 Tools for Counting sample points
2.6 Tools for Counting sample points When the number of simple events in S is too large, manual enumeration of every sample point in S is tedious or even impossible. (Example) If S contains N equiprobable
More informationLecture 2 Binomial and Poisson Probability Distributions
Binomial Probability Distribution Lecture 2 Binomial and Poisson Probability Distributions Consider a situation where there are only two possible outcomes (a Bernoulli trial) Example: flipping a coin James
More informationMath 710 Homework 1. Austin Mohr September 2, 2010
Math 710 Homework 1 Austin Mohr September 2, 2010 1 For the following random experiments, describe the sample space Ω For each experiment, describe also two subsets (events) that might be of interest,
More informationDiscussion 03 Solutions
STAT Discussion Solutions Spring 8. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they liked the new flavor, and the remaining indicated they
More information1 Combinatorial Analysis
ECE316 Notes-Winter 217: A. K. Khandani 1 1 Combinatorial Analysis 1.1 Introduction This chapter deals with finding effective methods for counting the number of ways that things can occur. In fact, many
More informationBiased Urn Theory. Agner Fog October 4, 2007
Biased Urn Theory Agner Fog October 4, 2007 1 Introduction Two different probability distributions are both known in the literature as the noncentral hypergeometric distribution. These two distributions
More informationω X(ω) Y (ω) hhh 3 1 hht 2 1 hth 2 1 htt 1 1 thh 2 2 tht 1 2 tth 1 3 ttt 0 none
3 D I S C R E T E R A N D O M VA R I A B L E S In the previous chapter many different distributions were developed out of Bernoulli trials. In that chapter we proceeded by creating new sample spaces for
More informationUnivariate Discrete Distributions
Univariate Discrete Distributions Second Edition NORMAN L. JOHNSON University of North Carolina Chapel Hill, North Carolina SAMUEL KOTZ University of Maryland College Park, Maryland ADRIENNE W. KEMP University
More informationRelationship between probability set function and random variable - 2 -
2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be
More informationCS206 Review Sheet 3 October 24, 2018
CS206 Review Sheet 3 October 24, 2018 After ourintense focusoncounting, wecontinue withthestudyofsomemoreofthebasic notions from Probability (though counting will remain in our thoughts). An important
More informationPart 2: One-parameter models
Part 2: One-parameter models 1 Bernoulli/binomial models Return to iid Y 1,...,Y n Bin(1, ). The sampling model/likelihood is p(y 1,...,y n ) = P y i (1 ) n P y i When combined with a prior p( ), Bayes
More informationDiscrete Structures for Computer Science
Discrete Structures for Computer Science William Garrison bill@cs.pitt.edu 6311 Sennott Square Lecture #24: Probability Theory Based on materials developed by Dr. Adam Lee Not all events are equally likely
More information3. The Multivariate Hypergeometric Distribution
1 of 6 7/16/2009 6:47 AM Virtual Laboratories > 12. Finite Sampling Models > 1 2 3 4 5 6 7 8 9 3. The Multivariate Hypergeometric Distribution Basic Theory As in the basic sampling model, we start with
More informationChapter 8 Sequences, Series, and Probability
Chapter 8 Sequences, Series, and Probability Overview 8.1 Sequences and Series 8.2 Arithmetic Sequences and Partial Sums 8.3 Geometric Sequences and Partial Sums 8.5 The Binomial Theorem 8.6 Counting Principles
More informationChapters 3.2 Discrete distributions
Chapters 3.2 Discrete distributions In this section we study several discrete distributions and their properties. Here are a few, classified by their support S X. There are of course many, many more. For
More information