Two characteristic properties of a random experiment:


1 Lecture 1

1.1 Probability as Relative Frequency

A single coin-flipping experiment. Observe a coin flipped many times. What probability can we assign to the event that a head turns up? If we expect the coin to be fair, we expect a head to turn up in approximately half of the flips:

Probability of a head = (# of times a head turns up) / (# of coin flips) ≈ 1/2, for a large number of flips.

Probability of an event = relative frequency of the event.

Two characteristic properties of a random experiment:

1. We cannot predict the outcome of the experiment.
2. We can estimate the relative frequencies of the outcomes.

Problem A box contains n balls; m are red, n − m are black. A ball is drawn at random. What is the probability that the ball is red? That the ball is black? That the ball is either red or black?

Problem Two coins are flipped. Write a model of this experiment. Find the probability that there is at least one head on the two coins.

1.2 Mathematical Model of Probability

Outcomes of a random experiment are observed. The set of all possible outcomes is called the sample space, S. Outcomes are labeled by small letters a, b, c, ...; in this notation, S = {a, b, c, ...}. An event is a subset of S. Events are denoted by capital letters A, B, C, D, E, ... The set containing no elements is called the null set, ∅.

An event A is a subset of an event B if every element of A is also an element of B: A ⊂ B if x ∈ A implies x ∈ B.
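The relative-frequency definition above can be illustrated numerically. A minimal sketch in Python (the function name and the fixed seed are illustrative, not from the notes):

```python
import random

def relative_frequency(num_flips, seed=0):
    """Estimate P(head) as (# of heads) / (# of flips) for a fair coin."""
    rng = random.Random(seed)  # fixed seed so the experiment is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

# For a large number of flips the ratio settles near 1/2.
print(relative_frequency(100_000))
```

Increasing `num_flips` makes the estimate cluster more tightly around 1/2, which is exactly the relative-frequency interpretation of probability.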

1. Operations on events.
2. Venn diagrams.
3. Properties of set operations. De Morgan's laws.
4. Proving the properties of set operations.
5. Union and intersection of several sets.
6. Disjoint sets.
7. The property that, for any two sets A, B: A = (AB) ∪ (AB^c), with AB and AB^c disjoint.

Problem Prove De Morgan's law:
(∪_{i=1}^n A_i)^c = ∩_{i=1}^n A_i^c.

2 Lecture 2

2.1 Axioms of Probability

Probability is a function from the set of events into the interval [0, 1] that satisfies the following properties:

1. P(A) ≥ 0, for any event A.
2. P(S) = 1.
3. For any pairwise disjoint events A_1, A_2, A_3, ..., A_n:
P(A_1 ∪ ... ∪ A_n) = P(A_1) + ... + P(A_n).

Properties of the probabilities.

Problem Show that P(∅) = 0.

Solution Since S ∪ ∅ = S, and S and ∅ are disjoint (why?), we can use axiom 3 to get
1 = P(S) = P(S ∪ ∅) = P(S) + P(∅) = 1 + P(∅),
which implies that P(∅) = 0.

Problem Show that for any event A,
P(A^c) = 1 − P(A), P(A) = 1 − P(A^c).

Solution Using the set identity S = A ∪ A^c and axiom 3, we get
1 = P(S) = P(A) + P(A^c),
from which the formulas follow.

Problem Show that for any event A, 0 ≤ P(A) ≤ 1.

Solution The first inequality, P(A) ≥ 0, is the statement of axiom 1. From the last problem we get
P(A) = 1 − P(A^c) ≤ 1.

Problem Show that
P(A ∪ B) = P(A) + P(B) − P(AB).
Hint: use the facts that
A = AB ∪ AB^c, B = AB ∪ A^cB, A ∪ B = AB^c ∪ A^cB ∪ AB.

Solution Using the hint and axiom 3, we get
P(A) = P(AB) + P(AB^c),
P(B) = P(AB) + P(A^cB),
P(A ∪ B) = P(AB^c) + P(A^cB) + P(AB).
Subtracting the sum of the first two equations from the last, we get the formula.

Problem Show that
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(AB) − P(AC) − P(BC) + P(ABC).
Hint: use the formula from the previous problem.

Solution Introduce the set D = A ∪ B. Using the previous formula,
P(D ∪ C) = P(D) + P(C) − P(DC)
= P(A ∪ B) + P(C) − P((A ∪ B)C)
= P(A) + P(B) − P(AB) + P(C) − P(AC ∪ BC)
= P(A) + P(B) − P(AB) + P(C) − (P(AC) + P(BC) − P(ABC))
= P(A) + P(B) + P(C) − P(AB) − P(AC) − P(BC) + P(ABC).
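The two-set formula P(A ∪ B) = P(A) + P(B) − P(AB) can be sanity-checked by brute-force enumeration on a small classical sample space. A sketch with a fair die and two illustrative events (the choice of A and B is arbitrary):

```python
from fractions import Fraction

# Classical probability on a fair die: each outcome has probability 1/6.
S = set(range(1, 7))
A = {2, 4, 6}   # "the roll is even"
B = {4, 5, 6}   # "the roll is at least 4"

def P(E):
    """Classical probability: (# of outcomes in E) / (# of outcomes in S)."""
    return Fraction(len(E), len(S))

lhs = P(A | B)                   # P(A ∪ B), computed directly
rhs = P(A) + P(B) - P(A & B)     # P(A) + P(B) − P(AB)
```

Here `lhs` and `rhs` agree exactly (both equal 2/3), illustrating the formula; `fractions.Fraction` avoids any floating-point rounding.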

2.2 Probability Space

The sample space S, the set of events F, and a probability P(A) that satisfies the axioms of probability together form a probability space, (S, F, P).

Examples of probability spaces.

1. Classical probability: S = {a_1, ..., a_n} and all outcomes are equally likely. The probability of each outcome is P(a_i) = 1/n, i = 1..n. If an event A contains k outcomes, then P(A) = k/n.

2. The area function. S is a square (or any other figure) of area 1. Events are the subsets of S, and P(A) = Area(A). Show that P(A) is a valid probability function.

3. Empirical probability. An experiment with n outcomes {a_1, ..., a_n} is repeated N times. Define
P(a_i) = (# of times a_i occurs) / N.

4. Belief probability. Betting on sports. The odds are quoted as the ratio of all money people bet on a particular team to all money bet against the team. The corresponding probabilities are defined as the ratios of all money people bet on a team to the total amount bet. An example is given in the table below.

Team     Odds    Probability of winning
Team 1   3:17    3/20 = .15
Team 2   1:1     1/2 = .50
Team 3   7:13    7/20 = .35

Note that all the probabilities sum up to 1. The odds let you quickly compute your return if a team wins. For example, if you bet $N on team 1 and it wins, your return (the money you get in excess of your bet) is $(17/3)N. Note that, according to the table, people rate team 1 lower than the other teams.

3 Lecture 3

3.1 Problems involving proportions

1. In a group of 10,000 people, 1,000 have home insurance; 2,000 have health insurance; 300 have both home and health insurance. Find the probability that a randomly selected person has at least one type of insurance. Solve the problem in two ways: by counting the number of people in the event in question, and by using the formula for the probability of a union of two events. Answer: P = .27.

2. In a group of 100 students, 50 take Math, 60 take Chem, and 10 take neither Math nor Chem. Find the probability that a random student takes both Math and Chem. Answer: P = .2.

3. In a small town, 25% of the population have life insurance; 10% have home insurance; 20% have health insurance; 5% have home and life insurance; 10% have health and life insurance; 3% have home and health insurance; 1% have all 3 types of insurance. Find the probability that a randomly selected person has at least one type of insurance. Solve the problem in two ways: using a Venn diagram, and using the formula for the union of three events. Answer: P = .38.

4. A group of people is polled on 3 questions. 55% said Yes on Question 3; 32.5% said Yes exactly twice; 10% said Yes on all three questions; 12.5% said Yes on Question 1 and Question 2. Find the probability that a randomly selected person answered No, No, Yes on Question 1, Question 2 and Question 3. Hint: draw the Venn diagram and denote the unknown probabilities by letters. Set up and solve the system of equations for the unknown probabilities. Answer: P = .15.

3.2 Classical Probability

Recall the definition of classical probability from Lecture 2: we have an experiment with finitely many outcomes that are equally likely. For an event A,
P(A) = (# of outcomes in A) / (# of outcomes in S).

1. A well-balanced die is rolled 3 times. Find: (a) the probability that the same number appears 3 times; (b) the probability that the last two numbers are the same; (c) the probability that the same number appears at least twice.

Answers: 6/216 = 1/36; 36/216 = 1/6; 1 − (6 · 5 · 4)/216 = 4/9.

The number of ways to select a combination of k objects from n objects equals
C(n, k) = n! / (k!(n − k)!).
When we select a combination, we don't return a selected object to the original set, and we don't pay attention to the order in which objects are selected.

2. Lottery problem. In the Texas Powerball lottery, a combination of 5 numbers in the range 1–69 is chosen at random (no repetitions allowed). The lottery pays $1,000,000 to everyone who correctly guesses all 5 numbers. It pays $7 to anyone who guesses 3 numbers. Find the probabilities of the corresponding events.
P(5 numbers correct) = 1 / C(69, 5),
P(exactly 3 numbers match) = C(5, 3) C(64, 2) / C(69, 5).
Find the probability that at least three numbers match.
P(at least 3 numbers correct) = [C(5, 3) C(64, 2) + C(5, 4) C(64, 1) + C(5, 5)] / C(69, 5).
The lottery ticket costs $2. Is it worth playing this game repeatedly? We'll get back to this question when we introduce the expectation of a random variable.

3. Committee problem. A committee of 5 people is selected at random from a pool of 15 women and 10 men. Find the probability that the committee contains all men. Find the probability that 2 men and 3 women are selected.
Sample space: all combinations of 5 people out of 25; #S = C(25, 5).
Let A be the event that only men are selected. #A = C(10, 5), so
P(A) = C(10, 5) / C(25, 5).
Let A be the event that 3 women and 2 men are selected. #A = C(15, 3) C(10, 2), so
P(A) = C(15, 3) C(10, 2) / C(25, 5).
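The lottery counts above can be evaluated directly with Python's `math.comb`; the variable names are illustrative:

```python
from math import comb
from fractions import Fraction

total = comb(69, 5)  # all 5-number combinations from the range 1-69

p_all_five = Fraction(1, total)
p_exactly_three = Fraction(comb(5, 3) * comb(64, 2), total)
# "at least 3": exactly 3, exactly 4, or all 5 numbers match
p_at_least_three = sum(
    Fraction(comb(5, k) * comb(64, 5 - k), total) for k in (3, 4, 5)
)
```

Using `Fraction` keeps the answers exact; calling `float()` on them gives the decimal values (winning the jackpot is roughly one in 11 million).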

Problem Find the flaw in the following solution. A box contains 5 black and 5 white balls. Two balls are drawn at random. Let's compute the probability that at least one ball is white. The total number of combinations is C(10, 2). There are C(5, 1) ways to select one white ball. Since the remaining ball can be of any color, there are C(9, 1) ways to select the remaining ball. So,
P(at least one white) = C(5, 1) C(9, 1) / C(10, 2).
The correct answer:
P(at least one white) = [C(5, 1) C(5, 1) + C(5, 2)] / C(10, 2).

4. Card problem. From a standard deck of cards (52 = 13 · 4), 6 cards are chosen at random. Find the probability that there is only one ace in the hand. Find the probability that there are exactly two aces in the hand.
#S = C(52, 6).
Let A be the event that a hand of six cards has one ace: A = {all combinations of 6 cards with one ace}. #A = C(4, 1) C(48, 5), so
P(A) = C(4, 1) C(48, 5) / C(52, 6).
Let A be the event that a hand of six cards has two aces. #A = C(4, 2) C(48, 4), so
P(A) = C(4, 2) C(48, 4) / C(52, 6).

5. Key problem. In a set of 10 keys, only one opens the door. You choose 7 keys at random. What is the probability that the right key is among those selected?
S = {combinations of 7 keys out of 10}; #S = C(10, 7).
Let A be the event that the right key is among the 7 selected: A = {all combinations of 7 keys from 10 that include the right key}. #A = C(9, 6), so
P(A) = C(9, 6) / C(10, 7) = 7/10.

6. Sock problem. A drawer contains 10 pairs of socks of different colors. You select 8 socks at random. Find the probability that you don't have a matching pair.
#S = C(20, 8).
Let A be the set of all combinations of socks of different colors:
#A = C(10, 8) · 2^8;
first select 8 different colors out of 10; each color can then be represented by 1 of the 2 socks of that color.
P(A) = C(10, 8) · 2^8 / C(20, 8).

7. Another sock problem. A drawer contains 20 pairs of socks of two colors. What is the minimum number of socks you have to draw to get at least one complete pair with probability 1?

8. Selecting objects with replacement. 3 contracts are assigned at random to 4 firms (a firm might get multiple contracts). Find the probability that all contracts go to a single firm.
S = {(a_1, a_2, a_3), a_i ∈ {firm 1, firm 2, firm 3, firm 4}}, an assignment of firms to contracts 1–3. #S = 4^3.
P(A) = C(4, 1) / 4^3 = 1/16.
(This is also a slot-machine problem: 3 objects are selected from 4 with replacement; you win when all objects are the same.)

When choosing a combination of k objects, we do not pay attention to the order in which the objects are chosen; that is, we can select the same combination in many different ways. Sometimes it is important to know which objects we select first, second, and so on. The number of ways to select k objects from a set of n objects, when objects are distinguished by the order in which they are selected, equals
C(n, k) · k! = n! / (n − k)!.

9. Birthday problem. Find the probability that at least two people in a group of r people have birthdays on the same day. Assume that every year has 365 days.
Answer: P = 1 − 365! / ((365 − r)! · 365^r). For r = 80, P ≈ .99.
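The birthday-problem answer can be evaluated without computing 365! directly, by accumulating the product of the factors (365 − i)/365. A sketch (the function name is illustrative):

```python
def p_shared_birthday(r):
    """P(at least two of r people share a birthday) = 1 - 365!/((365-r)! * 365^r)."""
    p_all_distinct = 1.0
    for i in range(r):
        # i-th person must avoid the i birthdays already taken
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct
```

The famous threshold is r = 23, where the probability first exceeds 1/2; by r = 80 a shared birthday is nearly certain.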

Yet another way to make a selection is to select a combination from the set of n objects when duplicates are allowed: you can think of this as selecting an object, remembering what you've selected, returning the object back to the set, and repeating the process k − 1 more times. This is called selecting with replacement.

Problem (this requires some thinking) Show that the number of combinations when selecting k objects from n with replacement equals
C(n + k − 1, k).

A variation of a lottery problem. In a certain lottery, 8 numbers are selected from the set of numbers 1–40. Duplicates are allowed. Find the probability of winning the lottery. The probability is one over the total number of combinations; in this case,
P = 1 / C(47, 8).

4 Lecture 4

4.1 Conditional Probability

A survey of 10,000 people in a small town shows the following information:

              Men     Women    Total
Smokers       1,000     500     1,500
Non-smokers   3,000   5,500     8,500
Total         4,000   6,000    10,000

A person is chosen at random. Let S be the event that the person is a smoker, N non-smoker, M male, W female.
P(S) = 3/20, P(N) = 17/20, P(M) = 2/5.
P(MS) = 1/10, P(M ∪ S) = 9/20.
Find the probability that the person is a smoker if we know that the person is male: P(S|M) = 1/4. Find the probability that the person is female if the person is a non-smoker: P(W|N) = 11/17.
The last two examples are conditional probabilities. In general,
P(A|B) = (# of outcomes in AB) / (# of outcomes in B),

or
P(A|B) = P(AB) / P(B).
Sometimes it is convenient to use the conditional probability to determine P(AB):
P(AB) = P(A|B) P(B).

Problem What is more likely: for a man to be a smoker, or for a woman to be a smoker? Hint: compare P(S|M) and P(S|W).

Problem If we pick a smoker, is this person more likely to be a man or a woman? Hint: compare P(M|S) and P(W|S).

Problem A die is rolled. If the die shows an even number, what is the probability of getting 4? What is the probability of getting an even number if the die shows 4?
Solution 1/3, 1.

Problem Show that for a fixed B ∈ F, P(A|B), as a function of A, is a probability function on S.

Problem You have 3 boxes. Box X has 10 light bulbs of which 4 are defective; box Y has 6 light bulbs of which 1 is defective; box Z has 8 light bulbs of which 3 are defective. A box is chosen at random, and then a bulb is randomly selected from this box. Find the probability that the bulb is not defective. If the bulb is not defective, find the probability that it came from box Z.
Let N, D be the events that the bulb is not defective and defective, respectively. Let X, Y, Z be the events that box X, Y, Z is selected. Draw the tree diagram and label the branches with the correct probabilities.
P(N) = P(N(X ∪ Y ∪ Z)) = P(NX ∪ NY ∪ NZ) = P(NX) + P(NY) + P(NZ).
P(NX) = (1/3)(3/5) = 1/5, P(NY) = (1/3)(5/6) = 5/18, P(NZ) = (1/3)(5/8) = 5/24.
P(N) = 1/5 + 5/18 + 5/24 = 247/360.
The second probability:
P(Z|N) = P(ZN) / P(N) = (5/24) / (247/360) = 75/247.
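The light-bulb computation (total probability, then the conditional probability P(Z|N)) can be checked in exact arithmetic with `fractions.Fraction`; the dictionary layout is an illustrative choice:

```python
from fractions import Fraction

p_box = Fraction(1, 3)  # each box is equally likely to be chosen
# P(not defective | box): 6/10 for X, 5/6 for Y, 5/8 for Z
p_good = {"X": Fraction(6, 10), "Y": Fraction(5, 6), "Z": Fraction(5, 8)}

# Law of total probability: P(N) = sum over boxes of P(N | box) * P(box)
p_n = sum(p_box * p for p in p_good.values())

# P(Z | N) = P(ZN) / P(N)
p_z_given_n = (p_box * p_good["Z"]) / p_n
```

Exact fractions make it easy to spot arithmetic slips in tree-diagram calculations: here P(N) = 247/360 and P(Z|N) = 75/247.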

Problem How to ask an uncomfortable question. You want to poll people on a question that they may feel uncomfortable answering. To deal with this, statisticians came up with the following solution. You tell the person you want to poll to flip a coin, without showing it to you. If the coin shows a head, the person should answer Yes; if the coin shows a tail, the person should give the true answer. From polling a large number of people you compute p_e, the proportion of people who said Yes in the experiment. Find the proportion p of people who would actually answer Yes. Draw the tree diagram.
p_e = p + (1 − p) · 1/2, so p = 2p_e − 1.

Two events A, B are independent if
P(AB) = P(A) P(B).
If the events are independent, then
P(A|B) = P(A), P(B|A) = P(B).

Problem A box contains 5 black and 6 white balls. Two balls are drawn with replacement. Let A be the event that the first ball is black, and B the event that the second ball is white. Compute P(A), P(B), P(AB), P(B|A), P(A|B).
Solution P(A) = 5/11, P(B) = 6/11, P(AB) = 30/121, P(B|A) = P(B), P(A|B) = P(A).

Problem Repeat the problem for the selection without replacement.
P(A) = 5/11, P(B) = 6/11, as before, but P(AB) = (5/11)(6/10) = 3/11. In this situation the events are not independent.

5 Lecture 5

5.1 The Law of Total Probability

Let H_i, i = 1..k, be pairwise disjoint events such that ∪_{i=1}^k H_i = S. The Law of Total Probability states that for every event A,
P(A) = Σ_{i=1}^k P(A|H_i) P(H_i).

Problem Prove this formula.

Problem An insurance company divides new customers into high- and low-risk groups. From a statistical study, the company finds that 1 out of 3 people in the high-risk group will have an accident in a 1-year period, and 1 out of 10 in the low-risk group will have an accident in a 1-year period. If the high-risk group is 30% of all potential customers, what is the probability that a new policyholder will have an accident within one year?
Solution P(A) = (.3)(1/3) + (.7)(1/10) = .17.

5.2 Bayes' Theorem

Let H_i, i = 1..k, be pairwise disjoint events such that ∪_{i=1}^k H_i = S. For every event A with P(A) > 0,
P(H_j|A) = P(A|H_j) P(H_j) / Σ_{i=1}^k P(A|H_i) P(H_i).
This is called Bayes' formula.

Problem In the conditions of the previous problem, suppose a new policyholder has an accident in the first year. What is the probability that he is in the high-risk group?
Solution P(H_1|A) = .1/.17 ≈ .59.

Problem The problem above estimates the chance for a customer to be in the high-risk group. Now the Law of Total Probability can be used again to re-evaluate the chance that this customer will have an accident in one year. Compute this probability.

Problem A laboratory blood test is 95% effective in detecting a certain disease. The test is falsely positive for 1 out of 100 healthy persons. If .5% of the population has the disease, what is the probability that a person has the disease, given that the test result is positive?
Solution P ≈ .323. If H, D, T are the events that a random person is healthy, has the disease, and that the test is positive, then
P(D|T) = P(T|D) P(D) / (P(T|D) P(D) + P(T|H) P(H)).
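The blood-test numbers can be plugged into Bayes' formula directly. A sketch (the variable names are illustrative):

```python
sensitivity = 0.95   # P(T | D): test is positive given the disease
false_pos = 0.01     # P(T | H): test is positive for a healthy person
prevalence = 0.005   # P(D): 0.5% of the population has the disease

# Law of total probability: P(T) = P(T|D)P(D) + P(T|H)P(H)
p_positive = sensitivity * prevalence + false_pos * (1 - prevalence)

# Bayes' formula: P(D | T)
p_disease_given_positive = sensitivity * prevalence / p_positive
```

The result is about 0.323, matching the solution above: even a fairly accurate test gives mostly false positives when the disease is rare.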

The relatively low probability is explained by the fact that the disease is rare: there is only a P(D) = .005 probability that the person has the disease to start with. In this situation, a positive blood test is more likely to be a false positive.

6 Lecture 6

6.1 Random Variables

Let X(x) be a function from set A to set B. The set {X = b}, for b ∈ B, denotes the set of points a ∈ A such that X(a) = b. The set {X < b}, for b ∈ B, denotes the set of points a ∈ A such that X(a) < b. In general, the set {b_1 < X < b_2}, for b_1, b_2 ∈ B, denotes the set of points a ∈ A such that b_1 < X(a) < b_2.

For the function X(x) = (x − 2)^2 from R to R, determine the sets {X = 3}, {X < 2}, {X ≤ 100} and {1 ≤ X < 4}.

For the function X(x) from set A = {a, b, c, d, e} to set B = {−3, 0, 1, 2} given by the table

x    X(x)
a     2
b    −3
c     2
d     0
e     1

determine the sets {X = 2}, {X < 2}, {X ≤ 100} and {1 ≤ X < 3}.

Consider a game: two coins are tossed. You earn $1 for each head; you lose $3 for two tails.

Let X be the amount you get (win/lose) after one round of this game. Describe the statistical properties of X.

Sample space:
S = {HT, TH, HH, TT}, P(HT) = ... = P(TT) = 1/4.

Table of values of X:

Outcome   X(a)
HH         2
HT         1
TH         1
TT        −3

Events: {X = −3} = {TT}, {X = 1} = {HT, TH}, {X = 2} = {HH}.
Probabilities: P(X = −3) = 1/4, P(X = 1) = 1/2, P(X = 2) = 1/4.
Find the probability of the event {X > 0}: since {X > 0} = {TH, HT, HH}, P(X > 0) = 3/4.

Definition 1. A function from the set of outcomes into the real numbers is called a random variable. If a random variable takes at most countably many values, then it is called a discrete random variable.

Definition 2 (Probability Mass Function). The probability mass function of a discrete random variable X is a function from the set of values of X into the real numbers such that
p_X(x) = P(X = x), for any value x of X.

Typically p_X is given by the table of its values. For the example above, the table of the probability mass function is:

x     −3    1    2
p_X   1/4  1/2  1/4

p_X can also be represented by a bar graph.

A box contains 10 bulbs of which 3 are defective. 4 bulbs are selected at random without replacement. Let X be the number of defective bulbs in the sample. Write down the table of the probability mass function p_X.

Properties of p_X: for any value x of X,
0 ≤ p_X(x) ≤ 1;
if x_1, x_2, ... is the list of values of X, then Σ_i p_X(x_i) = 1.

Definition 3. The Cumulative Distribution Function (or CDF) is defined for all values x as
F_X(x) = P(X ≤ x).

Find the CDF for X from the previous problem.

Properties of CDFs:
for any x, 0 ≤ F_X(x) ≤ 1;
lim_{x→−∞} F_X(x) = 0;
lim_{x→+∞} F_X(x) = 1;
F_X(x) is continuous from the right: F_X(x) = lim_{y→x+} F_X(y).

For a step function
F(x) = 0 for x < …; …; 1/2 for … ≤ x < …; …; 1 for x ≥ …,

verify that F is a valid CDF. Find the PMF for F. Find the probability P(X < 3) if the random variable X has F as its CDF. Find P(X ≤ 0.5).

6.2 Notion of Expectation

For the game described at the beginning of Lecture 6, estimate the averaged accumulated wealth after N rounds of the game. After N rounds of the game we expect to lose $3 in N/4 games, win $1 in N/2 games, and win $2 in N/4 games. Total accumulated wealth:
(−3) · N/4 + 1 · N/2 + 2 · N/4 = N/4.
The averaged accumulated wealth per game equals 1/4. This number is called the expectation of X:
E[X] = Σ_i x_i P(X = x_i) = Σ_i x_i p_X(x_i).

1. Key problem. A man has N keys on a key chain. One key unlocks the door. The man randomly selects a key and tries it. If the key doesn't open the door, he puts it aside and selects another key from the set of the remaining keys. Let X be the number of tries needed to open the door. Find the PMF of X and the expectation E[X].
Answer: E[X] = (N + 1)/2.
Hint: P(X = k) = 1/N. This is statistically equivalent to the probability of putting the correct key at position k among N positions. It can also be computed using the conditional probability.

2. Solve the problem under the condition that the man always returns a key back to the chain.
Answer: E[X] = Σ_{k=1}^∞ (k/N)(1 − 1/N)^{k−1} = N.
Hint: use the independence of the outcomes of successive tries.

Properties of the expectation:
if X = a is constant, E[X] = E[a] = a;
E[aX + b] = a E[X] + b;
E[g(X)] = Σ_i g(x_i) p_X(x_i).

Problem Let X be a random variable with PMF:

x     …    …    …    …
p_X  1/4  1/8  1/2  1/8

Problem Find E[X^2].

Problem Find the probability mass function of X^2.

6.3 Variance

Compare two games.

Game 1:
Win $1 for each head;
Lose $3 for getting two tails.

Game 2:

Win $3 for getting two heads;
Win $1 for one of each;
Lose $4 for getting two tails.

Denote by X_1, X_2 the amount you win (lose) when playing Game 1 and Game 2, respectively. Compare E[X_1], E[X_2] and the bar graphs of p_{X_1} and p_{X_2}.

7 Review problems for Test 1

1. For any sets A, B show that AB ⊂ A ⊂ A ∪ B.
2. Show that if A ⊂ B then B^c ⊂ A^c.
3. Show that A = AB ∪ AB^c.
4. Show that A ∪ B = B ∪ B^cA.
5. Show that (∪_{i=1}^n A_i)B = ∪_{i=1}^n (A_iB).
6. List the axioms of probability.
7. Show that P(A^c) = 1 − P(A), for any event A.
8. Show that if A ⊂ B, then P(A) ≤ P(B).
9. Prove that for any sets A, B:
P(A ∪ B) = P(A) + P(B) − P(AB).
10. Prove that for any sets A, B:
P(AB^c) = P(A) − P(AB).
11. A group of students is polled to see whether or not they have studied French or German. The poll shows that 25 studied French, 20 studied German, and 5 studied both. Find the probability that a randomly selected student:
(a) studied only French;
(b) did not study German;
(c) studied French or German;
(d) studied neither language.
Write each of the events above using set notation, with F the set of students who studied French, G the set of students who studied German, and set operations.
12. Review problems involving three kinds of sets.
13. Review the following problems: committee problem, card problem, key problem, birthday problem.

14. In a community, 36% of families own a dog; 22% of the families that own a dog also own a cat; 30% of the families own a cat. Find the probability that a randomly selected family owns a dog and a cat. Find the probability that a family that owns a cat also owns a dog.

Let X be a random variable.

Definition 4. The variance of X is defined as
V(X) = E[(X − E[X])^2],
and the standard deviation as
σ = √(E[(X − E[X])^2]).

Problem Compute V(X_1) and V(X_2) from the coin-tossing experiment.

Problem Grade distribution in class.

Problem Let X be a discrete random variable with mean µ = E[X]. Suppose that the variance V(X) = 0. What is the probability mass function of X?

Properties of variance:
V(aX + b) = a^2 V(X);
V(X) = E[X^2] − (E[X])^2.

Tchebyshev's Inequality

Theorem 1. Let X be a discrete random variable with mean µ and standard deviation σ. For any k > 0,
P(|X − µ| < kσ) ≥ 1 − 1/k^2,
or, equivalently,
P(|X − µ| ≥ kσ) ≤ 1/k^2.

Proof. Let X be a random variable with PMF:

x    a_1  a_2  ...  a_n
p_X  p_1  p_2  ...  p_n

Then
σ^2 = E[(X − µ)^2] = Σ_i (a_i − µ)^2 p_i
= Σ_{i: |a_i − µ| < kσ} (a_i − µ)^2 p_i + Σ_{i: |a_i − µ| ≥ kσ} (a_i − µ)^2 p_i.
It follows that
k^2 σ^2 Σ_{i: |a_i − µ| ≥ kσ} p_i ≤ σ^2.
But
Σ_{i: |a_i − µ| ≥ kσ} p_i = P(|X − µ| ≥ kσ).
Thus we showed that
P(|X − µ| ≥ kσ) ≤ 1/k^2.

Problem A factory produces on average 120 items per week, with standard deviation σ = 10. Estimate the probability for the weekly production to be between 100 and 140 items. Find the shortest interval certain to contain at least 90% of the weekly production levels.
Answer: P ≥ 3/4. Interval: [120 − 10√10, 120 + 10√10] ≈ [88.4, 151.6].

8 Lecture 8

Special Discrete Distributions

1. Binomial B(n, p).

2. Geometric Geom(p).
3. Poisson Pois(λ).
4. Negative binomial NB(r, p).

For each of the distributions we will be interested in the description of random experiments leading to these distributions, and in the statistical properties: the mean E[X] and the variance V(X).

8.1 Binomial Distribution

Definition 5. n Bernoulli trials: an experiment with only two outcomes, {S, F}, is independently repeated n times. We set p = P(S), 1 − p = P(F).

Definition 6. Let X be a random variable that counts the number of successes, S, in n Bernoulli trials. X is called the binomial random variable B(n, p).

Sample space: all sequences of n letters S or F. For example, for n = 15, one outcome is
(SSSSFFFSFSFSFFF), with P(SSSSFFFSFSFSFFF) = p^7 (1 − p)^8.
Let X be the number of S's in a sequence. Then
P(X = k) = C(n, k) p^k (1 − p)^{n−k}, k = 0, 1, ..., n.

Problem Verify that P(X = k) is a valid probability mass function.

Problem The probability that a student answers each question correctly on a test is 1/4, independently of the other questions. If a test has 10 problems, what is the probability that the student answers at least 3 correctly?

Problem An automotive parts manufacturer produces transmissions for a certain vehicle. There is a 2% chance that any given transmission is defective. Determine the probability that more than 2 transmissions in a box of 100 are defective.

Problem An auto insurer is analyzing the claim frequency on a block of 250 policies. Historical data suggest that 10% of the policyholders will submit at least one claim in the coming year. What is the probability that more than 12% of the policyholders will submit at least one claim in the coming year?
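The binomial PMF makes problems like the student-test question above a short computation. A sketch; `binom_pmf` is an illustrative helper, not a library function:

```python
from math import comb

def binom_pmf(n, p, k):
    """P(X = k) for X ~ B(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Student answering 10 questions, each correct with probability 1/4:
# P(at least 3 correct) = 1 - P(0) - P(1) - P(2)
p_at_least_3 = 1 - sum(binom_pmf(10, 0.25, k) for k in range(3))
```

The complement trick (summing the few small terms and subtracting from 1) is the standard way to handle "at least k" binomial questions; the answer here is about 0.474.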

Problem Show that E[X] = np.

Problem The current price of a particular stock is $10. On any particular day the stock price either increases by $2, with probability 60%, or decreases by $3, with probability 40%. Determine the expected value of the stock price 4 days from now.

Problem Show that V(X) = np(1 − p).

8.2 Geometric Distribution

Definition 7. In repeated Bernoulli trials, let X count the number of F's before the first S. X is called the geometric random variable, Geom(p).

Let X be a geometric random variable. Then
P(X = 0) = p, P(X = 1) = (1 − p)p, ..., P(X = k) = (1 − p)^k p, k = 0, 1, 2, ...

Problem Verify that P(X = k) is a valid probability mass function.

Problem Show that
E[X] = (1 − p)/p, V(X) = (1 − p)/p^2.

Problem The probability to win a jackpot on a slot machine is 1/16. Find the probability that 4 or more plays are needed to win a jackpot.

8.3 Poisson Distribution

An insurance company estimates that 150 claims are filed on average per year. What is the probability that the company gets exactly k claims? There can be many answers. It might be that every year the company gets exactly 150 claims; in this case P(150 claims) = 1. Or it might get 100 claims one year and 200 claims the following year, in which case P(100 claims) = 1/2, P(200 claims) = 1/2, P(150 claims) = 0. Other arrangements are possible, leading to different answers: there is not enough information to answer the question. We will build a reasonable model based on the following assumptions.

1. Divide one year into small intervals I_k, k = 1..n, so small that we are almost certain that no more than one claim can be submitted during each interval I_k.
2. Assume that the event that a claim is filed during interval I_j is independent of the event that a claim is filed during interval I_k, for any j ≠ k.
3. We estimate from the conditions of the problem that the probability that a claim is filed during interval I_k is p = 150/n.

Under these assumptions we are dealing with n Bernoulli trials with success probability p. Thus the number of claims X is B(n, p):
P(X = k) = C(n, k) p^k (1 − p)^{n−k}.
Take the limit n → ∞ in the above expression and show that it converges to
P(X = k) = (150^k / k!) e^{−150}, k = 0, 1, 2, ...

Definition 8. We say that X is a Poisson random variable with parameter 150, Pois(150).

Problem Verify that P(X = k) is a valid probability mass function.

Problem Let X be Pois(λ). Compute E[X] and V(X):
E[X] = λ, V(X) = λ.

Problem Calls to a telephone hot-line service are made randomly and independently at the expected rate of 2 per minute. Find the probability that the service receives fewer than 5 calls in the next minute.
P(X < 5) = Σ_{k=0}^{4} (2^k / k!) e^{−2} ≈ .947.

Problem A book of 500 pages contains 300 typos that are distributed randomly. Find the probability that a given page contains: (a) exactly two misprints; (b) 2 or more misprints.

Problem The Poisson distribution Pois(λ) is used as an approximation to the binomial B(n, p) when n is large, p is small, and np = λ is moderate. Solve the following problem using the binomial and Poisson distributions, and compare the answers.
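The hot-line calculation can be reproduced directly from the Poisson PMF. A sketch (`pois_pmf` is an illustrative helper):

```python
import math

def pois_pmf(lam, k):
    """P(X = k) for X ~ Pois(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Rate 2 calls per minute; "fewer than 5 calls" sums k = 0..4
p_fewer_than_5 = sum(pois_pmf(2, k) for k in range(5))
```

The sum collapses nicely here: with λ = 2 the five terms add up to 7e^{−2} ≈ 0.947.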

8.4 Negative Binomial Distribution

Let r ≥ 1. Suppose we repeat Bernoulli trials until we get the success S r times. Let X denote the number of failures before the r-th success. For example, an outcome with r = 3 could be FFFSSFFFS, with
P(FFFSSFFFS) = (1 − p)^6 p^3.
To find P(X = k) we have to count the number of combinations of r − 1 letters S among the first k + r − 1 letters F and S. Thus,
P(X = k) = C(k + r − 1, r − 1) (1 − p)^k p^r, k = 0, 1, 2, 3, ...

Definition 9. X is called the negative binomial random variable, NB(r, p).

Problem The probability to hit a target is 60%. The target is destroyed after 3 hits. Find the probability that it takes more than 6 shots to destroy the target.
Let X be the number of times we miss before the target is destroyed. X is NB(3, .6). We need the probability that it takes 7 or more shots: with 3 hits, we need the number of missed shots to be 4 or more.
P(X ≥ 4) = Σ_{k=4}^∞ C(k + 2, 2) (.4)^k (.6)^3.

9 Lecture 9

Test I

10 Lecture 10

10.1 Gambler's Ruin Problem

Let X be the payoff of a coin-toss game such that P(X = 1) = P(X = −1) = 1/2. Suppose you start with x dollars and play the game n times. Let X_1, X_2, ..., X_n be the payoffs in each of the games. Your accumulated wealth after n games is
S_n = x + X_1 + X_2 + ... + X_n.
Each S_n is a discrete random variable. The sequence S_n, n = 1, 2, ..., is called a random walk. If you compute your expected accumulated wealth after n turns:
E[S_n] = x,

as you would expect: on average you gain nothing from this game.

Suppose that the game ends for you once you have no money left and cannot borrow. This is expressed by the condition S_n = 0, which we refer to as going bankrupt. Let's show that
P(go bankrupt) = 1.
For simplicity, let's assume that x is an integer. The probability in question is a function of x only, and we denote it
q_x = P(go bankrupt if you start with $x).
We are going to look at how q_x depends on x. First of all, q_0 = 1: if you have no money, you're bankrupt. Using the law of total probability we can write
q_x = (1/2) q_{x−1} + (1/2) q_{x+1}, x ≥ 1.
This is a system of difference equations. We solve it by trying a formula q_x = Ax + B, for some A, B. It follows from the initial condition that B = 1. Then q_x = Ax + 1. But if A ≠ 0, then for large x, q_x is either negative or greater than 1, which is not possible for a probability. So A = 0, and q_x = 1, for all x.

Suppose you have an option to stop or continue playing anytime you want, but you have to stop if you go bankrupt: S_n = 0. Consider the following strategy: set a goal of N dollars, and stop the game when you reach the goal, provided, of course, that you didn't go bankrupt before. Find the probability that you reach the goal before going bankrupt.

Set p_x as the probability to reach the goal before going bankrupt when you start with $x. Use the Law of Total Probability to show that p_x satisfies the system of difference equations
p_x = (1/2) p_{x+1} + (1/2) p_{x−1}, 1 ≤ x ≤ N − 1,
p_0 = 0, p_N = 1.
Solve the system of equations assuming p_x = Ax + B, for some A, B. Answer: p_x = x/N.

Let Y be your total winnings (the return) in the above game. Find E[Y].
Answer: E[Y] = (N − x) · x/N − x · (1 − x/N) = 0.
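The result p_x = x/N can be checked by simulating the random walk until it hits either 0 or N. A sketch (the function name, trial count, and seed are illustrative):

```python
import random

def p_reach_goal(x, N, trials=20_000, seed=1):
    """Estimate p_x: the probability of reaching $N before $0, starting from $x."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    wins = 0
    for _ in range(trials):
        wealth = x
        while 0 < wealth < N:
            wealth += 1 if rng.random() < 0.5 else -1  # fair $1 coin-toss game
        wins += (wealth == N)
    return wins / trials
```

For example, starting with $3 and aiming for $10, the simulated probability should cluster around 3/10, in agreement with p_x = x/N.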

Find the probability q_x to go bankrupt before reaching the goal, if you start with $x dollars. Answer: q_x = 1 − x/N. What happens as N → +∞?

10.2 Continuous Random Variables

Let X be a random variable, i.e., a function from the sample space S into R. The cumulative distribution function (CDF) is defined as
F_X(x) = P(X ≤ x).

Definition 10. X is a continuous random variable if F_X(x) is a differentiable function. The derivative
f(x) = dF_X/dx
is called the probability density function (PDF) of X.

Properties of continuous random variables:

1. The PDF f(x) is a non-negative function.
2. ∫_{−∞}^{+∞} f(x) dx = 1.
3. For any a ≤ b,
P(a < X ≤ b) = F_X(b) − F_X(a) = ∫_a^b f(x) dx.
4. For any x, P(X = x) = 0.
5. For any a ≤ b,
P(a < X < b) = P(a ≤ X ≤ b) = P(a ≤ X < b) = ∫_a^b f(x) dx.

Definition 11. A random variable X which is neither discrete nor continuous is called a mixed random variable.

10.3 Expectation and Variance

1. The expectation of X is defined as the integral

       E[X] = ∫_{-∞}^{+∞} x f(x) dx.

2. The variance and the standard deviation are defined as

       V(X) = ∫_{-∞}^{+∞} (x - E[X])^2 f(x) dx = ∫_{-∞}^{+∞} x^2 f(x) dx - (E[X])^2,
       σ = sqrt(V(X)).

3. Tschebyshev's inequality: for any k > 0,

       P(|X - µ| > k) <= σ^2 / k^2.

Properties of the Expectation and Variance:

    E[aX + b] = a E[X] + b,    V(aX + b) = a^2 V(X).

Problem Prove both formulas. Hint: if the PDF of X is f(x), what is the PDF of Y = aX + b? Find the PDF of Y and its mean E[Y].

10.4 Uniform Distribution

X is a uniform random variable on the interval [a,b] if its PDF f(x) takes a constant value on [a,b] and is zero otherwise:

    f(x) = { 1/(b-a)   a <= x <= b,
           { 0         x < a or x > b.

Problem Find the mean E[X] and the variance V(X) of a uniform random variable. Answer:

    E[X] = (a+b)/2,    V(X) = (b-a)^2 / 12.

Problem Let X be the time (in minutes) a bus arrives at the bus stop. Suppose that X is a uniform random variable over the interval from 1:00pm to 2:00pm. Find the probability that the bus arrives between 1:30 and 1:45pm. Now suppose you arrive at the bus stop at 1:15pm and find out that the bus has not yet passed. Find the probability that the bus arrives in the next 20 minutes. Hint: for the second question you need to find P(15 < X <= 35 | X > 15).
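The hint can be carried out numerically; for a uniform random variable every probability is a ratio of interval lengths (the helper name is ours):

```python
def unif_prob(a, b, lo, hi):
    """P(lo < X <= hi) for X uniform on [a, b]."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0) / (b - a)

# Minutes after 1:00pm; the bus arrival time X is uniform on [0, 60].
p1 = unif_prob(0, 60, 30, 45)                              # bus in 1:30-1:45
p2 = unif_prob(0, 60, 15, 35) / unif_prob(0, 60, 15, 60)   # P(15 < X <= 35 | X > 15)
print(p1, p2)  # 0.25 and 4/9 ≈ 0.444
```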

10.5 Exponential Random Variable

Let X be a Poisson random variable Pois(λ), which counts the number of occurrences of some event in a unit of time, starting at time zero. Let T be the time at which the first event occurs. For t > 0, what is the probability P(T > t)?

One way to find this probability is the following. Let Y count the same events in the interval [0,t]. Y is Poisson Pois(λt); this can be shown using the binomial approximation of the Poisson distribution. Then

    P(T > t) = P(Y = 0).

(Why?) Since P(Y = 0) = e^{-λt}, we get P(T > t) = e^{-λt}. The CDF of T:

    F_T(t) = 1 - e^{-λt},   t > 0,

and zero otherwise. The PDF of T:

    f(t) = { λ e^{-λt}   t > 0,                    (10.1)
           { 0           t <= 0.

Definition 12. T with PDF (10.1) is called an exponential random variable, Exp(1/λ).

The textbook uses the parameter θ = 1/λ in the formula for f(t). θ has the meaning of an average waiting time, since E[T] = θ. With this parameter we can also write the PDF of T:

    f(t) = { (1/θ) e^{-t/θ}   t > 0,
           { 0                t <= 0.

Problem Show that the time between the i-th and (i+1)-th events is exponential with the PDF given above.

10.5.1 Properties of Exponential Distribution

Problem Find the mean E[X] and the variance V(X) of an exponential random variable. Answer: E[X] = 1/λ, V(X) = 1/λ^2.

Problem The exponential distribution is the unique continuous distribution that has the property of being memoryless: for any t, s >= 0,

    P(T >= t + s | T > s) = P(T >= t).

Verify this property.
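Memorylessness follows directly from P(T > t) = e^{-λt}; a numeric spot-check (the rate λ and the time points are arbitrary choices):

```python
import math

lam = 1.5                                  # an arbitrary rate λ
surv = lambda t: math.exp(-lam * t)        # P(T > t) for T exponential with rate lam

t, s = 0.7, 1.3
lhs = surv(t + s) / surv(s)                # P(T > t+s | T > s)
print(lhs, surv(t))                        # the two sides agree
```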

Problem Show that this property can be equivalently expressed as

    P(T > t + s | T > s) = P(T > t).

Problem Suppose that the number of miles that a car can run before its battery wears out is an exponential random variable with an average value of 10,000 miles. Find the probability that the car is still operational after the first 5,000 miles. Find the probability that the battery still works after 25,000 miles given that it didn't fail for the first 20,000 miles. Answer: by memorylessness both probabilities are the same and equal e^{-1/2} ≈ 0.61.

The next problem shows a connection between the Poisson and uniform distributions.

Problem Let N be a Poisson Pois(λ) random variable that counts the number of customers in a store in the first hour after the store opens. Let X denote the time of arrival of the first customer. Show that, conditioned on the event {N = 1},

    P(0 < X <= x | N = 1) = x,   x ∈ (0,1).

This means that conditioned on the fact that there is only one customer in the first hour, the time of her arrival is a uniform random variable. Hint: think of a Poisson random variable as a binomial random variable B(n, λ/n) with n large, and use the formula

    P(0 < X <= x | N = 1) = P({0 < X <= x} ∩ {N = 1}) / P(N = 1).

10.6 Normal Distribution

X is called a normal random variable with mean µ and standard deviation σ if its PDF is

    f(x) = (1 / sqrt(2πσ^2)) e^{-(x-µ)^2 / (2σ^2)}.

Notation: X ~ N(µ, σ).

Problem If X is N(µ, σ), then Y = aX + b is N(aµ + b, |a|σ).

10.6.1 Standard Normal random variables

Z ~ N(0,1) is called a standard normal random variable. If X is N(µ, σ), then Z = (X - µ)/σ is a standard normal random variable.
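The standardization Z = (X - µ)/σ can be checked by simulation: the transformed sample should have mean near 0 and standard deviation near 1 (the values of µ, σ and the sample size are our choices):

```python
import random
import statistics

mu, sigma = 10, 2
xs = [random.gauss(mu, sigma) for _ in range(100_000)]
zs = [(x - mu) / sigma for x in xs]         # standardized sample

print(round(statistics.mean(zs), 3), round(statistics.stdev(zs), 3))  # ≈ 0 and ≈ 1
```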

It is common to find the values of the probability

    Φ(z) = P(0 <= Z <= z) = ∫_0^z (1/sqrt(2π)) e^{-t^2/2} dt

in a table of the standard normal variable, or programmed in a calculator.

Problem Let X be a normal random variable N(10, 2). Using the table values of Φ(z), determine P(8 <= X <= 13). Answer: P = Φ(1) + Φ(1.5) ≈ 0.3413 + 0.4332 = 0.7745.

Problem (From the textbook) A machine produces steel shafts whose diameter has a normal distribution with mean 1.005 and standard deviation 0.01 inch. Quality control requires diameters to fit into the interval 1.00 ± 0.02 inches. What percentage of output will fail the quality control? Answer: 7.3%.

10.6.2 Three Percentiles

The following rules are often used to estimate the probabilities of a normal random variable X:

    P(µ - σ <= X <= µ + σ) ≈ 68%,
    P(µ - 2σ <= X <= µ + 2σ) ≈ 95%,
    P(µ - 3σ <= X <= µ + 3σ) ≈ 99.7%.

10.6.3 Normal Approximation of Binomial

Let X be a binomial B(n, p) random variable, so E[X] = np and σ = sqrt(np(1-p)). The random variable

    (X - np) / sqrt(np(1-p))

has zero mean and standard deviation 1. Let Z be a standard normal variable.

Theorem 2 (De Moivre-Laplace theorem). For any a <= b, as n → +∞,

    P(a <= (X - np)/sqrt(np(1-p)) <= b) → P(a <= Z <= b).

The approximation is good when np(1-p) is large.
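Instead of a table, Φ can be evaluated with the error function from Python's standard library; a sketch for the N(10, 2) problem above:

```python
import math

def norm_cdf(z):
    """P(Z <= z) for a standard normal Z, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 10, 2
p = norm_cdf((13 - mu) / sigma) - norm_cdf((8 - mu) / sigma)
print(round(p, 4))  # ≈ 0.7745, matching the table computation
```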

Problem (From [3]) An unknown fraction p of a population are smokers. A random sample with replacement of size n is taken. How large should n be so that the error in estimating p is less than 0.5%?

Let X be the number of smokers in the sample. X is a binomial B(n, p) random variable, and the estimate of p is X/n. We'd like to have X/n ≈ p with high probability. For that, we select a confidence level, say 0.95, and require that

    P(|X/n - p| <= 0.005) >= 0.95.

Rewrite the probability as

    P(|X/n - p| <= 0.005) = P(-0.005n <= X - np <= 0.005n)
        = P( -0.005 sqrt(n)/sqrt(p(1-p)) <= (X - np)/sqrt(np(1-p)) <= 0.005 sqrt(n)/sqrt(p(1-p)) ).

By the De Moivre-Laplace theorem the last probability is well approximated by

    P( -0.005 sqrt(n)/sqrt(p(1-p)) <= Z <= 0.005 sqrt(n)/sqrt(p(1-p)) ) = 2 Φ( 0.005 sqrt(n)/sqrt(p(1-p)) ),

where Z is a standard normal variable. From the table we find that we need

    0.005 sqrt(n)/sqrt(p(1-p)) >= 1.96,   i.e.   sqrt(n) >= 392 sqrt(p(1-p)).

But for any p ∈ [0,1], p(1-p) <= 1/4, so sqrt(p(1-p)) <= 1/2. Thus it is enough to take sqrt(n) >= 392/2 = 196, i.e.

    n >= 196^2 = 38416.

10.7 Further examples of random variables

Problem Suppose a man standing at the point (0,1) shoots an arrow at a target located at (0,0) by randomly choosing an angle from [-π/2, π/2]. Let X denote the distance along the x-axis from the origin to the point where the arrow lands. Find the PDF of this random variable. Compute E[X]. A random variable with such a PDF is called a Cauchy random variable. A random variable that has either infinite expectation or infinite variance is sometimes referred to as heavy-tailed.

Solution Let Θ be the angle the archer selects. Θ is a uniform random variable Uni([-π/2, π/2]), and the distance from the target is X = 1/|tan Θ|. Let x > 0. Then

    P(X <= x) = P(|tan Θ| >= 1/x) = 2 P(tan Θ >= 1/x)
        = 2 P(Θ >= arctan(1/x)) = 2 ∫_{arctan(1/x)}^{π/2} (1/π) dθ
        = (2/π) (π/2 - arctan(1/x)).

It follows that

    f(x) = d/dx P(X <= x) = (2/π) · 1/(1 + x^2).

Observe now that (substituting z = x^2)

    E[X] = (2/π) ∫_0^{+∞} x/(1 + x^2) dx = (1/π) ∫_0^{+∞} 1/(1 + z) dz = (1/π) ln(1 + z) |_0^{+∞} = +∞.

Problem (Compound distribution) Suppose you are given a choice to play Game 1 or Game 2. The payoff in Game 1 is a random variable X_1 with CDF F_1(x), and the payoff in Game 2 is a random variable X_2 with CDF F_2(x). After comparing X_1 and X_2 you have decided that you slightly prefer Game 1 to Game 2, but you also like to play Game 2 from time to time. Let p ∈ (0,1), with p > 1/2, be the weight that describes your preference for Game 1 over Game 2. In deciding which game to play, you use the following setup: flip an unfair coin that lands heads with probability p. If the coin lands heads, play Game 1 and get payoff X_1; if tails, play Game 2 and get payoff X_2. Find the CDF of the payoff in this compound game. Answer:

    F(x) = p F_1(x) + (1-p) F_2(x).

11 Multivariate Random Distributions

11.1 Discrete bi-variate random variables

Let X, Y be two discrete random variables with values of X in {x_1, ..., x_n} and values of Y in {y_1, ..., y_m}.

Definition 13. The joint probability mass function of X and Y is the function p(x,y):

    p(x_i, y_j) = P({X = x_i} ∩ {Y = y_j}),   i = 1..n,  j = 1..m.

The PMF of (X,Y) is non-negative and

    Σ_{i=1}^n Σ_{j=1}^m p(x_i, y_j) = 1.

Marginal distributions. The PMF of X, p_X(x), can be determined from p(x,y) by

    p_X(x_i) = Σ_j p(x_i, y_j),

and likewise for p_Y(y):

    p_Y(y_j) = Σ_i p(x_i, y_j).

p_X, p_Y are called marginal probability mass functions. The information about the joint PMF can be described by a contingency table:

            x_1         ...   x_n         | p_Y(y)
    y_1     p(x_1,y_1)  ...   p(x_n,y_1)  | p_Y(y_1)
    ...     ...         ...   ...         | ...
    y_m     p(x_1,y_m)  ...   p(x_n,y_m)  | p_Y(y_m)
    --------------------------------------+---------
    p_X(x)  p_X(x_1)    ...   p_X(x_n)    | 1

Problem You have 3 boxes. Box 1 has 10 light bulbs of which 4 are defective, box 2 has 6 light bulbs of which 1 is defective, box 3 has 8 light bulbs of which 3 are defective. A box is chosen at random and then a bulb is randomly selected from this box. Let X be the number of the box selected, and let Y be the random variable

    Y = { 1   the bulb is defective,
        { 0   the bulb is non-defective.

Find the joint probability mass function of (X,Y).

Problem Find the marginal probability mass functions.

Definition 14. Random variables X, Y are called independent if for all (x,y),

    p(x,y) = p_X(x) p_Y(y).

11.2 Continuous bi-variate random variables

Definition 15. We say that a pair of random variables X, Y are continuous bi-variate random variables if there is a function f(x,y) such that

    P(a < X <= b, c < Y <= d) = ∫_a^b ∫_c^d f(x,y) dy dx,

for any a, b, c, d. The function f(x,y) is called the joint probability density function.

Properties of joint PDF

1. f(x,y) is a non-negative function and ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} f(x,y) dy dx = 1.
2. For any set A ⊆ R^2, P((X,Y) ∈ A) = ∫∫_A f(x,y) dy dx.

Marginal PDFs

We can obtain the PDF of each variable separately from the following argument:

    P(a < X <= b) = P(a < X <= b, -∞ < Y < +∞) = ∫_a^b ( ∫_{-∞}^{+∞} f(x,y) dy ) dx.

So,

    f_X(x) = ∫_{-∞}^{+∞} f(x,y) dy,

and in a similar way

    f_Y(y) = ∫_{-∞}^{+∞} f(x,y) dx.

f_X(x), f_Y(y) are called the marginal PDFs of f(x,y).

Definition 16. Random variables X, Y are called independent if

    f(x,y) = f_X(x) f_Y(y)   for all (x,y).

Problems

(From [5]) A joint probability density function of random variables X, Y is given by

    f(x,y) = { 2(1-x)   0 <= x <= 1, 0 <= y <= 1,
             { 0        otherwise.

Find P(X < 0.5, 0.4 < Y < 0.7). Answer: 0.225.

Find P(X < Y, Y < 0.5).
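The first probability above is the double integral of f over the box [0, 0.5] × [0.4, 0.7]; a midpoint-rule check (the helper name is ours):

```python
def joint_prob(f, ax, bx, ay, by, n=200):
    """Midpoint-rule approximation of the double integral of f over a box."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

f = lambda x, y: 2 * (1 - x)             # the joint PDF from the problem
print(joint_prob(f, 0, 0.5, 0.4, 0.7))   # ≈ 0.225
```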

(From [2]) Suppose that a point is chosen at random inside a circle of radius R centered at the origin. Let X, Y be the coordinates of the point. The joint PDF is given as

    f(x,y) = { c   x^2 + y^2 < R^2,
             { 0   otherwise,

for some constant c.

1. Find c.
2. Find the marginal PDFs of X and Y.
3. Let Z be the distance from (X,Y) to the origin. Find the PDF of Z.
4. Find E[Z]. Note that the answer is not R/2. (The radius R/sqrt(2), which divides the area of the circle in half, is the median of Z rather than its mean.)

For a pair of random variables X, Y with joint PDF

    f(x,y) = { e^{-(x+y)}   0 < x < +∞, 0 < y < +∞,
             { 0            otherwise,

find the PDF of the function Z = X/Y. Answer: f_Z(z) = 1/(z+1)^2, 0 < z < +∞.

11.3 Some special multivariate distributions

1. Bivariate uniform distribution: let A be a set of positive area in R^2. If (X,Y) are the coordinates of a point we choose at random inside A, then the joint PDF equals

       f(x,y) = { 1/|A|   (x,y) ∈ A,
                { 0       otherwise.

   Problem Show that if A is a square [a,b] × [c,d], then (X,Y) are independent random variables.

2. Bivariate normal distribution. (X,Y) are said to be bivariate normal variables if for some numbers (µ_1, µ_2) and (a, b, c) such that a > 0, ab - c^2 > 0,

       f(x,y) = (sqrt(ab - c^2) / 2π) e^{ -a(x-µ_1)^2/2 - c(x-µ_1)(y-µ_2) - b(y-µ_2)^2/2 }.

3. Multinomial distribution. This is a multivariate discrete distribution that generalizes the binomial distribution. Suppose we deal with n independent repetitions of an experiment that has k outcomes with probabilities p_1, p_2, ..., p_k, where Σ_i p_i = 1. Let X_1 count the number of times outcome 1 occurred, X_2 the number of times outcome 2 occurred, and so on. Note that always X_1 + ... + X_k = n.

Then

    P(X_1 = n_1, X_2 = n_2, ..., X_k = n_k) = ( n! / (n_1! n_2! ... n_k!) ) p_1^{n_1} p_2^{n_2} ... p_k^{n_k}.

This is called the multinomial distribution.

11.4 Expectation, Variance, Covariance and Correlation

1. The expectations E[X], E[Y] are given by the formulas

       E[X] = ∫ x f_X(x) dx = ∫∫ x f(x,y) dx dy,
       E[Y] = ∫ y f_Y(y) dy = ∫∫ y f(x,y) dx dy,

   and the variances by

       V(X) = E[(X - E[X])^2],   V(Y) = E[(Y - E[Y])^2].

2. The covariance of X, Y is defined as

       cov(X,Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].

3. The correlation of X and Y is defined as

       ρ(X,Y) = cov(X,Y) / sqrt(V(X)V(Y)).

4. Show that for all random variables X, Y,

       -1 <= ρ(X,Y) <= 1.

5. If X, Y are independent random variables, then

       E[XY] = E[X]E[Y],   cov(X,Y) = 0.

   Note, however, that cov(X,Y) = 0 doesn't always imply that X, Y are independent random variables. Consider an example: X is uniform on [-1,1] and Y = X^2. Show that cov(X,Y) = 0, but X and Y are not independent.

Problem Compute the correlation ρ for random variables (X,Y) with PDF

    f(x,y) = { 1/2   0 < x < y, 0 < y < 2,
             { 0     otherwise.

Answer: E[X] = 2/3, E[Y] = 4/3, E[XY] = 1, cov(X,Y) = 1/9, V(X) = 2/9, V(Y) = 2/9, ρ = 1/2.

Further properties of expectations. Let X_i, i = 1..n, be random variables of any kind, and let S_n = X_1 + ... + X_n. Then

    E[S_n] = Σ_i E[X_i],

and

    Var(S_n) = Σ_i Var(X_i) + 2 Σ_{i<j} cov(X_i, X_j).

An important special case: when X_i and X_j are independent for all i ≠ j,

    Var(S_n) = Σ_i Var(X_i).

Problem Use the last formula to compute the variance of a binomial random variable X ~ B(n, p).

11.5 The Weak Law of Large Numbers and The Central Limit Theorem

Consider a sequence X_1, X_2, X_3, ..., of pairwise independent random variables with the same PDFs. Denote µ = E[X_1], σ^2 = V(X_1). Think of the X_i as repeated measurements of some physical parameter (say, electric current). We'd like to study the statistics of the average

    S_N = (X_1 + ... + X_N) / N.

Let's compute E[S_N] and V(S_N):

    E[S_N] = µ,    V(S_N) = σ^2 / N.

Notice that V(S_N) decreases to zero as N → +∞, while E[S_N] is independent of N. Thus averaging produces a more accurate estimate of µ as the number of measurements increases. To describe this effect mathematically, we use Tschebyshev's inequality for S_N:

    P(|S_N - µ| > ε) <= V(S_N)/ε^2 = σ^2/(ε^2 N).

Thus, for any ε > 0,

    lim_{N→+∞} P(|S_N - µ| > ε) = 0.

This is the Weak Law of Large Numbers.
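The statement V(S_N) = σ^2/N is visible in simulation: for X_i uniform on [0,1] (so σ^2 = 1/12), the sample variance of the average, scaled by N, stays near 1/12 (the sample sizes and repetition count are our choices):

```python
import random
import statistics

def averages(N, reps=2000):
    """reps independent draws of the average of N Uni[0,1] variables."""
    return [sum(random.random() for _ in range(N)) / N for _ in range(reps)]

for N in (10, 100):
    v = statistics.variance(averages(N))
    print(N, round(N * v, 4))   # both ≈ 1/12 ≈ 0.0833
```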

Central Limit Theorem. There is even more that one can say about S_N. Let's normalize S_N so that it has zero mean and unit standard deviation:

    Z_N = (S_N - µ) / (σ/sqrt(N)) = (X_1 + ... + X_N - Nµ) / (σ sqrt(N)).

Then the Central Limit Theorem states that for any a, b,

    lim_{N→+∞} P(a < Z_N < b) = P(a < Z < b),

where Z is an N(0,1) random variable. To state it slightly differently: the average of a large number of pairwise independent, identically distributed random variables approximately equals

    (X_1 + ... + X_N)/N ≈ µ + (σ/sqrt(N)) Z,

i.e. it equals the mean µ up to an error that has a normal distribution.

11.6 Bi-variate Normal Distribution

Problem Determine E[X], E[Y], Var(X), Var(Y), cov(X,Y) for a bi-variate normal random variable.

Problem Show that bi-variate normal random variables (X,Y) are independent iff cov(X,Y) = 0.

Problem Show that there is a linear transformation that makes (X,Y) independent.

Let (X,Y) be the deviation from the center of a target in a shooting experiment, and let f(x,y) be the joint PDF of (X,Y). We postulate that:

1. X and Y are independent and identically distributed, i.e., for some PDF f_0,

       f(x,y) = f_0(x) f_0(y);

2. the probability density g(r,θ) of the random variables (R,Θ), the polar coordinates of (X,Y), does not depend on θ.

We will prove that under these two conditions

    f(x,y) = (1/(2πσ^2)) e^{-(x^2+y^2)/(2σ^2)}

for some σ. Show first that

    g(r,θ) = r f(r cos θ, r sin θ).

The second assumption implies that for some function p(r),

    f(r cos θ, r sin θ) = p(r)/r,

or, in Cartesian coordinates,

    f(x,y) = p(sqrt(x^2+y^2)) / sqrt(x^2+y^2) =: h(x^2+y^2),

for some function h of a single argument. The first assumption then implies that

    f_0(x) f_0(y) = h(x^2+y^2).

Take y = 0, and assume f_0(0) ≠ 0. Then f_0(x) = h(x^2)/f_0(0). Call c = f_0(0). Then

    h(x^2) h(y^2) = c^2 h(x^2+y^2),   for all x, y.

The only function that verifies this equation is h(x^2) = c^2 e^{αx^2}, for some constant α. Thus we showed that

    f(x,y) = c^2 e^{α(x^2+y^2)}.

Since f(x,y) is a PDF, its integral over all of R^2 must equal 1. This forces α < 0, and writing α = -1/(2σ^2) for some σ, f is as claimed.

11.7 Conditional Distribution

Problem Let Y be a randomly chosen number from the list {1,...,n}. Let X be the outcome of a flip of a biased coin with probability of tails p = 1/Y. Let p(x,y) be the joint PMF of (X,Y). For a fixed value of y, the function

    p(x,y) / P(Y = y)

is a probability mass function of X. Verify that.

Let (X,Y) be a pair of discrete random variables with values {x_1,...,x_n} and {y_1,...,y_m} and joint PMF p(x,y).

Definition 17. The conditional distribution of X given Y = y_k, with P(Y = y_k) > 0, is the probability mass function

    p_{X|Y=y_k}(x_i) = P(X = x_i | Y = y_k),   i = 1...n.
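The coin-and-list problem can be verified with exact arithmetic. Here n = 5 (our choice), X = 1 encodes tails, and dividing the joint PMF by the marginal of Y recovers a genuine PMF for X:

```python
from fractions import Fraction as F

n = 5
p = {}                                    # joint PMF of (X, Y)
for y in range(1, n + 1):
    p[(1, y)] = F(1, n) * F(1, y)         # tails with probability 1/y
    p[(0, y)] = F(1, n) * (1 - F(1, y))

for y in range(1, n + 1):
    pY = p[(0, y)] + p[(1, y)]            # marginal P(Y = y) = 1/n
    cond = {x: p[(x, y)] / pY for x in (0, 1)}
    assert sum(cond.values()) == 1        # a probability mass function
    assert cond[1] == F(1, y)             # recovers P(X = 1 | Y = y) = 1/y
print("conditional PMFs check out")
```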



More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Introduction to Probability 2017/18 Supplementary Problems

Introduction to Probability 2017/18 Supplementary Problems Introduction to Probability 2017/18 Supplementary Problems Problem 1: Let A and B denote two events with P(A B) 0. Show that P(A) 0 and P(B) 0. A A B implies P(A) P(A B) 0, hence P(A) 0. Similarly B A

More information

Review of Probabilities and Basic Statistics

Review of Probabilities and Basic Statistics Alex Smola Barnabas Poczos TA: Ina Fiterau 4 th year PhD student MLD Review of Probabilities and Basic Statistics 10-701 Recitations 1/25/2013 Recitation 1: Statistics Intro 1 Overview Introduction to

More information

Random Variables Example:

Random Variables Example: Random Variables Example: We roll a fair die 6 times. Suppose we are interested in the number of 5 s in the 6 rolls. Let X = number of 5 s. Then X could be 0, 1, 2, 3, 4, 5, 6. X = 0 corresponds to the

More information

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

Chapter 2. Continuous random variables

Chapter 2. Continuous random variables Chapter 2 Continuous random variables Outline Review of probability: events and probability Random variable Probability and Cumulative distribution function Review of discrete random variable Introduction

More information

Probability Theory and Applications

Probability Theory and Applications Probability Theory and Applications Videos of the topics covered in this manual are available at the following links: Lesson 4 Probability I http://faculty.citadel.edu/silver/ba205/online course/lesson

More information

Probabilistic models

Probabilistic models Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became

More information

MATH/STAT 3360, Probability Sample Final Examination Model Solutions

MATH/STAT 3360, Probability Sample Final Examination Model Solutions MATH/STAT 3360, Probability Sample Final Examination Model Solutions This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Probability. Lecture Notes. Adolfo J. Rumbos

Probability. Lecture Notes. Adolfo J. Rumbos Probability Lecture Notes Adolfo J. Rumbos October 20, 204 2 Contents Introduction 5. An example from statistical inference................ 5 2 Probability Spaces 9 2. Sample Spaces and σ fields.....................

More information

MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics

MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 13: Expectation and Variance and joint distributions Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Lecture 1: Probability Fundamentals

Lecture 1: Probability Fundamentals Lecture 1: Probability Fundamentals IB Paper 7: Probability and Statistics Carl Edward Rasmussen Department of Engineering, University of Cambridge January 22nd, 2008 Rasmussen (CUED) Lecture 1: Probability

More information

Notes for Math 324, Part 20

Notes for Math 324, Part 20 7 Notes for Math 34, Part Chapter Conditional epectations, variances, etc.. Conditional probability Given two events, the conditional probability of A given B is defined by P[A B] = P[A B]. P[B] P[A B]

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

7. Be able to prove Rules in Section 7.3, using only the Kolmogorov axioms.

7. Be able to prove Rules in Section 7.3, using only the Kolmogorov axioms. Midterm Review Solutions for MATH 50 Solutions to the proof and example problems are below (in blue). In each of the example problems, the general principle is given in parentheses before the solution.

More information

Statistical Theory 1

Statistical Theory 1 Statistical Theory 1 Set Theory and Probability Paolo Bautista September 12, 2017 Set Theory We start by defining terms in Set Theory which will be used in the following sections. Definition 1 A set is

More information

Example: Suppose we toss a quarter and observe whether it falls heads or tails, recording the result as 1 for heads and 0 for tails.

Example: Suppose we toss a quarter and observe whether it falls heads or tails, recording the result as 1 for heads and 0 for tails. Example: Suppose we toss a quarter and observe whether it falls heads or tails, recording the result as 1 for heads and 0 for tails. (In Mathematical language, the result of our toss is a random variable,

More information

Statistics for Economists. Lectures 3 & 4

Statistics for Economists. Lectures 3 & 4 Statistics for Economists Lectures 3 & 4 Asrat Temesgen Stockholm University 1 CHAPTER 2- Discrete Distributions 2.1. Random variables of the Discrete Type Definition 2.1.1: Given a random experiment with

More information

Relationship between probability set function and random variable - 2 -

Relationship between probability set function and random variable - 2 - 2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be

More information

Chapter 2 Class Notes

Chapter 2 Class Notes Chapter 2 Class Notes Probability can be thought of in many ways, for example as a relative frequency of a long series of trials (e.g. flips of a coin or die) Another approach is to let an expert (such

More information

STA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS

STA 2023 EXAM-2 Practice Problems. Ven Mudunuru. From Chapters 4, 5, & Partly 6. With SOLUTIONS STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru, Venkateswara Rao STA 2023 Spring 2016 1 1. A committee of 5 persons is to be formed from 6 men and 4 women. What

More information

Topic 3: The Expectation of a Random Variable

Topic 3: The Expectation of a Random Variable Topic 3: The Expectation of a Random Variable Course 003, 2017 Page 0 Expectation of a discrete random variable Definition (Expectation of a discrete r.v.): The expected value (also called the expectation

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS

STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6. With SOLUTIONS STA 2023 EXAM-2 Practice Problems From Chapters 4, 5, & Partly 6 With SOLUTIONS Mudunuru Venkateswara Rao, Ph.D. STA 2023 Fall 2016 Venkat Mu ALL THE CONTENT IN THESE SOLUTIONS PRESENTED IN BLUE AND BLACK

More information

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables

CDA6530: Performance Models of Computers and Networks. Chapter 2: Review of Practical Random Variables CDA6530: Performance Models of Computers and Networks Chapter 2: Review of Practical Random Variables Two Classes of R.V. Discrete R.V. Bernoulli Binomial Geometric Poisson Continuous R.V. Uniform Exponential,

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM

TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM Topic Concepts Degree of Importance References NCERT Book Vol. II Probability (i) Conditional Probability *** Article 1.2 and 1.2.1 Solved Examples 1 to 6 Q. Nos

More information

Part (A): Review of Probability [Statistics I revision]

Part (A): Review of Probability [Statistics I revision] Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery

More information

STAT:5100 (22S:193) Statistical Inference I

STAT:5100 (22S:193) Statistical Inference I STAT:5100 (22S:193) Statistical Inference I Week 3 Luke Tierney University of Iowa Fall 2015 Luke Tierney (U Iowa) STAT:5100 (22S:193) Statistical Inference I Fall 2015 1 Recap Matching problem Generalized

More information

ISyE 6739 Test 1 Solutions Summer 2015

ISyE 6739 Test 1 Solutions Summer 2015 1 NAME ISyE 6739 Test 1 Solutions Summer 2015 This test is 100 minutes long. You are allowed one cheat sheet. 1. (50 points) Short-Answer Questions (a) What is any subset of the sample space called? Solution:

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. Experiment, outcome, sample space, and sample point

More information

Math Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them

Math Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them Math 302.102 Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them solely for their final exam preparations. The final exam

More information

PROBABILITY.

PROBABILITY. PROBABILITY PROBABILITY(Basic Terminology) Random Experiment: If in each trial of an experiment conducted under identical conditions, the outcome is not unique, but may be any one of the possible outcomes,

More information

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution Pengyuan (Penelope) Wang June 15, 2011 Review Discussed Uniform Distribution and Normal Distribution Normal Approximation

More information

the time it takes until a radioactive substance undergoes a decay

the time it takes until a radioactive substance undergoes a decay 1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete

More information

3 PROBABILITY TOPICS

3 PROBABILITY TOPICS Chapter 3 Probability Topics 135 3 PROBABILITY TOPICS Figure 3.1 Meteor showers are rare, but the probability of them occurring can be calculated. (credit: Navicore/flickr) Introduction It is often necessary

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Discrete Random Variable

Discrete Random Variable Discrete Random Variable Outcome of a random experiment need not to be a number. We are generally interested in some measurement or numerical attribute of the outcome, rather than the outcome itself. n

More information

Lecture 1. ABC of Probability

Lecture 1. ABC of Probability Math 408 - Mathematical Statistics Lecture 1. ABC of Probability January 16, 2013 Konstantin Zuev (USC) Math 408, Lecture 1 January 16, 2013 1 / 9 Agenda Sample Spaces Realizations, Events Axioms of Probability

More information

Chapter 2: Random Variables

Chapter 2: Random Variables ECE54: Stochastic Signals and Systems Fall 28 Lecture 2 - September 3, 28 Dr. Salim El Rouayheb Scribe: Peiwen Tian, Lu Liu, Ghadir Ayache Chapter 2: Random Variables Example. Tossing a fair coin twice:

More information

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) =

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) = 1. If X has density f(x) = { cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. 2. Let X have density f(x) = { xe x, 0 < x < 0, otherwise. (a) Find P (X > 2). (b) Find

More information