MATH/STAT 3360, Probability FALL 2018 Hong Gu


Lecture Notes and In-Class Examples

Basic Principle of Counting
Question 1. A statistics textbook has 8 chapters, and each chapter has 50 questions. How many questions are there in total in the book?

Basic Principle of Counting
Question 2. How many 7-digit telephone numbers can be assigned so that no two assigned numbers differ in just one digit (so, for example, two numbers differing only in their final digit could not both be assigned)? [Hint: consider the sum of the digits modulo 10.]

Permutations
Question 3. How many 9-digit numbers are there that contain each of the digits 1 through 9 exactly once?

Permutations
Question 4. A band is planning a tour of Canada. They plan to perform in 15 venues across Canada before returning to their home in Halifax, and they want to find the cheapest way to arrange the tour. The only way to do this is to work out the cost of all possible orderings of the venues. They have a computer which can calculate the cost of 100,000,000 orderings of venues every second. (a) How long will it take the computer to calculate the cost of all possible orderings of the venues? (b) What if they want to extend the tour to 18 venues?

Permutations
Question 5. In an Olympic race with 8 contestants, how many possibilities are there for the medal finishes? That is, how many different ways are there to choose 3 contestants in order?

Combinations
Question 6. How many 3-element subsets are there in a 12-element set?

Combinations
Question 7. [Grid diagram with corners labelled A and B not reproduced in this transcription.] In the diagram, how many shortest paths are there from A to B?

Multinomial Coefficients
Question 8. How many distinct ways can the letters of the word PROBABILITY be arranged?

Multinomial Coefficients
Question 9. A professor decides to award a fixed number of each grade to his students. He decides that among his class of 10 students, he will award one A, two Bs, three Cs, three Ds and one fail. How many different results are possible in the course?

Multinomial Coefficients
Question 10. In the polynomial (x + y + z)^6, what is the coefficient of x^2 y^3 z?

Multinomial Coefficients
Question 11. How many ways are there to divide a class of 20 students into 10 pairs, if the order of the 10 pairs doesn't matter?

Sample Spaces and Events
Some rules on the operations of events:
Commutative laws: E ∪ F = F ∪ E; EF = FE
Associative laws: (E ∪ F) ∪ G = E ∪ (F ∪ G); (EF)G = E(FG)
Distributive laws: (E ∪ F)G = EG ∪ FG; EF ∪ G = (E ∪ G)(F ∪ G)
DeMorgan's laws: (∪_{i=1}^n E_i)^c = ∩_{i=1}^n E_i^c; (∩_{i=1}^n E_i)^c = ∪_{i=1}^n E_i^c
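
As a quick sanity check of these identities (my own addition, not part of the original notes), the following Python sketch verifies the distributive and DeMorgan laws on events in a small finite sample space; the sample space and events are made up for illustration.

    import random

    # A small finite sample space and three randomly chosen events (illustrative only).
    S = set(range(10))
    random.seed(0)
    E, F, G = (set(random.sample(sorted(S), 4)) for _ in range(3))

    def complement(A):
        return S - A

    # Distributive law: (E u F)G = EG u FG
    assert (E | F) & G == (E & G) | (F & G)

    # DeMorgan's laws: (E u F)^c = E^c F^c and (EF)^c = E^c u F^c
    assert complement(E | F) == complement(E) & complement(F)
    assert complement(E & F) == complement(E) | complement(F)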

Sample Spaces and Events
Question 12. For events A, B, C and D, how are the events (AB) ∪ (CD) and (A ∪ C)(B ∪ D) related?

Axioms of Probability
The three axioms of probability:
Axiom 1: 0 ≤ P(E) ≤ 1
Axiom 2: P(S) = 1
Axiom 3: For any sequence of mutually exclusive events E_1, E_2, ...,
P(∪_{i=1}^∞ E_i) = Σ_{i=1}^∞ P(E_i)

Axioms of Probability
Question 13. For an experiment with sample space {1, 2, 3, 4, 5}, is there a probability P such that P({1, 2}) = 0.4, P({1, 3}) = 0.6 and P({2, 3}) = 0.1?

Axioms of Probability
Question 14. For an experiment with sample space {1, 2, 3, 4, 5}, is there a probability P such that P({1, 2, 3}) = 0.4, P({1, 3, 4}) = 0.6 and P({2, 4, 5}) = 0.1?

Simple Propositions
Prop 1: P(E^c) = 1 − P(E)
Prop 2: If E ⊂ F, then P(E) ≤ P(F)
Prop 3: P(E ∪ F) = P(E) + P(F) − P(EF)
Prop 4 (inclusion-exclusion):
P(E_1 ∪ E_2 ∪ ... ∪ E_n) = Σ_{i=1}^n P(E_i) − Σ_{i_1 < i_2} P(E_{i_1} E_{i_2}) + ... + (−1)^{r+1} Σ_{i_1 < i_2 < ... < i_r} P(E_{i_1} E_{i_2} ... E_{i_r}) + ... + (−1)^{n+1} P(E_1 E_2 ... E_n)
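
As a small numeric sketch (mine, not from the notes), the three-event inclusion-exclusion formula can be evaluated directly; the probabilities below are the ones given in Question 15.

    # Inclusion-exclusion for three events, using the numbers from Question 15.
    pA, pB, pC = 0.6, 0.5, 0.4
    pAB, pAC, pBC = 0.4, 0.3, 0.2
    pABC = 0.1

    p_union = pA + pB + pC - pAB - pAC - pBC + pABC
    print(p_union)  # 0.7, up to floating-point rounding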

Simple Propositions
Question 15. For events A, B and C, we know that P(A) = 0.6, P(B) = 0.5 and P(C) = 0.4, and that P(AB) = 0.4, P(AC) = 0.3 and P(BC) = 0.2, and P(ABC) = 0.1. What is P(A ∪ B ∪ C)?

Simple Propositions
Question 16. For events A, B and C, we know that P(A) = 0.5, P(B) = 0.7 and P(C) = 0.3, and that P(AB) = 0.3, P(AC) = 0.2 and P(BC) = 0.2, and P(A ∪ B ∪ C) = 0.9. What is P(ABC)?

Simple Propositions
Question 17. A fair coin is tossed 5 times. (a) What is the probability that the sequence HHH occurs somewhere in the 5 tosses? (b) What is the probability that the sequence THT occurs somewhere in the 5 tosses?

Simple Propositions
Question 18. What is the probability that a number chosen uniformly at random in the range 1 to 1,000,000 is divisible by at least one of 2, 3, or 5?

Simple Propositions
Question 19. You are planning to learn some languages. You want to maximise the number of people with whom you will be able to speak, or equivalently, the probability that you will be able to speak with a randomly chosen person. You perform some preliminary research and find out the following:
Language   Proportion of people who speak it
Chinese    19.4%
English    7.3%
Hindi      7.1%
Spanish    6.0%
Arabic     4.0%
...

Question 19 (continued). You also investigate what proportion of people speak each pair of languages, as summarised in the following table.
           Chinese  English  Hindi  Spanish  Arabic
Chinese      -       0.2%    0.1%    0.0%    0.1%
English     0.2%      -      1.8%    1.6%    0.7%
Hindi       0.1%     1.8%     -      0.0%    0.0%
Spanish     0.0%     1.6%    0.0%     -      0.1%
Arabic      0.1%     0.7%    0.0%    0.1%     -
If you assume that the number of people speaking three languages is small enough to be neglected, which three languages should you learn in order to maximise the number of people to whom you can speak?

Sample Spaces of Equally Likely Events
P(E) = (number of outcomes in E) / (number of outcomes in S)

Sample Spaces of Equally Likely Events
Question 20. What is the probability that 13 randomly chosen cards from a standard deck include all four aces?
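
Counting answers in this section can be sanity-checked numerically; a sketch (mine, not the official solution) for Question 20 using math.comb:

    from math import comb

    # P(13 randomly chosen cards include all four aces):
    # the 4 aces plus 9 of the remaining 48 cards, over all 13-card hands.
    p = comb(48, 9) / comb(52, 13)
    print(p)  # approximately 0.00264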

Sample Spaces of Equally Likely Events
Question 21. A particular genetic disease affects people if and only if they have two defective copies of the gene in question. [Everyone has two copies of the gene, one from each parent, and passes on one copy at random to each of their children.] If both members of a couple have a sibling with the disease (but neither the couple nor their parents have the disease), (a) what is the probability that their first child will have the disease? (b) If they have two children, what is the probability that both will have the disease?

Sample Spaces of Equally Likely Events
Question 22. What is the probability that a poker hand (with five cards in total) contains exactly three Aces?

Probability as a Continuous Set Function
Proposition 6.1. If {E_n, n ≥ 1} is either an increasing or a decreasing sequence of events, then lim_{n→∞} P(E_n) = P(lim_{n→∞} E_n).

Conditional Probability
Definition of conditional probability: P(E | F) = P(EF)/P(F), provided P(F) > 0.
The multiplication rule: P(E_1 E_2 E_3 ... E_n) = P(E_1) P(E_2 | E_1) P(E_3 | E_1 E_2) ... P(E_n | E_1 ... E_{n−1})

Conditional Probability
Question 23. What is the probability that the sum of two dice is at least 7, conditional on it being odd?


Conditional Probability
Question 24. Two of your friends often talk to you about the books they like. From experience, you have found that if Andrew likes a book, the probability that you will also like it is 70%, while if Brian likes a book, the probability that you will also like it is 40%. You are shopping with another friend, who tells you that she remembers Andrew telling her that he liked the book you are considering buying, but she's not completely sure that it was Andrew who told her, rather than Brian. If the probability that it was Andrew who told her is 80%, what is the probability that you will like the book?

Conditional Probability
Question 25. Two fair 6-sided dice are rolled. (a) What is the probability that the larger of the two numbers obtained is 5, given that the smaller is 3? [By "the smaller number is 3" I mean that one of the numbers is 3 and the other is at least 3, so that if the roll is (3,3), the smaller number is 3.] (b) One of the dice is red, and the other is blue. What is the probability that the number rolled on the red die is 5, given that the number rolled on the blue die is 3 and the number rolled on the red die is at least 3? (c) Why are the answers to (a) and (b) different?

Conditional Probability
Question 26. You are a contestant in a game-show. There are three doors, numbered 1, 2, and 3, one of which has a prize behind it. You are to choose a door, then the host will open another door to show that the prize is not behind it (if the door you chose has the prize behind it, he chooses which door to open at random). You then have the opportunity to switch to the other unopened door. You choose door 1, and the host opens door 2. (a) Should you switch to door 3? (b) What is your probability of winning (assuming you make the best choice of whether or not to switch)?


Conditional Probability
Question 27. You are a contestant in a game-show. There are three doors, numbered 1, 2, and 3, one of which has a prize behind it. You are to choose a door, then the host will open another door to show that the prize is not behind it (if the door you chose has the prize behind it, he chooses which door to open at random). You then have the opportunity to switch to the other unopened door. In this case, you have cheated and know that the prize is not behind door 2. You choose door 1, and the host opens door 2. (a) Should you switch to door 3? (b) What is your probability of winning (assuming you make the best choice of whether or not to switch)?

Bayes' Theorem
Law of total probability: P(E) = P(EF) + P(EF^c) = P(E | F)P(F) + P(E | F^c)P(F^c)
Suppose that F_1, F_2, ..., F_n are mutually exclusive events such that ∪_{i=1}^n F_i = S; thus E = ∪_{i=1}^n EF_i, and we have
P(E) = Σ_{i=1}^n P(EF_i) = Σ_{i=1}^n P(E | F_i)P(F_i)

Bayes' Theorem
Bayes' Theorem: P(F_j | E) = P(EF_j)/P(E) = P(E | F_j)P(F_j) / Σ_{i=1}^n P(E | F_i)P(F_i)
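
A small helper (illustrative only; the function name and the example numbers are mine, not the course's) that evaluates Bayes' theorem via the law of total probability:

    def bayes(priors, likelihoods, j):
        """P(F_j | E) for mutually exclusive, exhaustive events F_1, ..., F_n,
        where priors[i] = P(F_i) and likelihoods[i] = P(E | F_i)."""
        total = sum(p * l for p, l in zip(priors, likelihoods))  # P(E), law of total probability
        return priors[j] * likelihoods[j] / total

    # Hypothetical disease-testing numbers (made up for illustration):
    # F_1 = diseased (prior 0.001), F_2 = healthy; the test is positive with
    # probability 1 given the disease and 0.02 given no disease.
    print(bayes(priors=[0.001, 0.999], likelihoods=[1.0, 0.02], j=0))  # about 0.048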

Bayes' Theorem
Question 28. A test for a particular disease has probability of giving a positive result if the patient does not have the disease (and always gives a positive result if the patient has the disease). The disease is rare, and the probability that an individual has the disease is (a) An individual is tested at random, and the test gives a positive result. What is the probability that the individual actually has the disease? (b) A patient goes to the doctor with a rash, which is a symptom of this rare disease, but can also be caused by other problems. The probability that a person without the disease has this rash is If the patient's test comes back positive, what is the probability that he has the disease?

Bayes' Theorem
Question 29. A civil servant wants to conduct a survey on drug abuse. She is concerned that people may not admit to using drugs when asked. She therefore asks the participants to toss a coin (so that she cannot see the result) and, if the toss is heads, to answer the question YES, regardless of the true answer. If the toss is tails, they should give the true answer. That way, if they answer YES to the question, she will not know whether it is a true answer. Suppose that 5% of the survey participants actually use drugs. What is the probability that an individual uses drugs, given that they answer YES to the question on the survey?

Independent Events
Definition of independent events:
Two events E and F are independent if P(EF) = P(E)P(F).
Three events E, F and G are independent if
P(EFG) = P(E)P(F)P(G)
P(EF) = P(E)P(F)
P(EG) = P(E)P(G)
P(FG) = P(F)P(G)
The definition extends to n events, and finally to an infinite set of events.

Independent Events
Question 30. 4 fair coins are tossed. Which of the following pairs of events are independent?
(a) The first coin is a head. / There are exactly two heads among the four tosses.
(b) There are exactly two heads among the first three tosses. / There are exactly two heads among the last three tosses.
(c) The sequence TTH occurs in the 4 tosses. / The sequence THT occurs.

Independent Events
Question 31. Three cards are drawn from a standard deck. Which of the following pairs of events are independent?
(a) The first card is a heart. / There are exactly two queens among the three cards.
(b) At least two of the cards are hearts. / At least two of the cards are 9s.
(c) All three cards are red. / None of the cards is an Ace.

Conditional Probability is a Probability
The conditional probability P(· | F) is a probability, and thus satisfies all three properties of probability. All propositions for probabilities apply to conditional probability; for example:
P(E_1 | F) = P(E_1 | E_2 F)P(E_2 | F) + P(E_1 | E_2^c F)P(E_2^c | F)
Conditional independence: P(E_1 E_2 | F) = P(E_1 | F)P(E_2 | F)

Conditional Probability is a Probability
Question 32. For three events A, B and C, we know that P(A | C) = 0.4, P(B | C) = 0.3, P(ABC) = 0.1, P(C) = 0.8. Find P(A ∪ B | C).

Discrete Random Variables
For a discrete random variable X, we define the probability mass function p(a) of X by p(a) = P(X = a). The pmf is positive for at most a countable number of values of a.

Expected Value
Expected value of a discrete random variable, definition: E[X] = Σ_{x: p(x)>0} x p(x)
The frequency interpretation: if we win x_i with probability p(x_i), the average winnings per game will be Σ_{i=1}^n x_i p(x_i) = E[X]

Expected Value
Question 33. What is the expected value of the product of the numbers rolled on two fair 6-sided dice?
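
A brute-force check of Question 33 by enumerating the 36 equally likely outcomes (a sketch, not the intended pencil-and-paper argument):

    from itertools import product

    # E[product of two fair dice] by direct enumeration.
    outcomes = list(product(range(1, 7), repeat=2))
    expected = sum(a * b for a, b in outcomes) / len(outcomes)
    print(expected)  # 12.25, i.e. 3.5 * 3.5 by independence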

Expectation of a Function of a Random Variable
If a discrete RV X has pmf p(x_i), i ≥ 1, then E[g(X)] = Σ_i g(x_i) p(x_i) = Σ_j y_j P(g(X) = y_j)
If a and b are constants, then E[aX + b] = aE[X] + b

Expectation of a Function of a Random Variable
Question 34. You are bidding for an object in an auction. You do not know the other bids. You believe that the highest other bid (in dollars) will be uniformly distributed on the interval (100, 150). You can sell the item for $135. What should you bid in order to maximise expected profit?

Variance
Definition: Var[X] = E[(X − µ)^2] = E[X^2] − (E[X])^2, where E[X] = µ.
If a and b are constants, then Var[aX + b] = a^2 Var[X]
SD(X) = √Var(X)

Variance
Question 35. Two fair 6-sided dice are rolled, and the sum is taken. What is the variance of this sum?

Binomial Random Variables
The pmf of a Binomial random variable with parameters (n, p) is
p(i) = C(n, i) p^i (1 − p)^{n−i}, i = 0, 1, ..., n.
E[X] = np
Var(X) = np(1 − p)
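
A short sketch (mine, not from the notes) that evaluates the Binomial(n, p) pmf with math.comb and uses it for a tail probability of the kind asked for in Question 36:

    from math import comb

    def binom_pmf(i, n, p):
        # P(X = i) for X ~ Binomial(n, p)
        return comb(n, i) * p**i * (1 - p)**(n - i)

    # Question 36: X ~ Binomial(100, 1/200); P(X > 5) = 1 - P(X <= 5).
    n, p = 100, 1 / 200
    print(1 - sum(binom_pmf(i, n, p) for i in range(6)))  # on the order of 1e-5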

Binomial Random Variables
Question 36. An insurance company sets its premium for car insurance at $500. They estimate that the probability of a customer making a claim is 1/200, in which case they will pay out $5,000. They sell policies to 100 customers. After costs, the premiums are enough for them to pay out up to 5 claims. What is the probability that the insurance company will have to pay out more than 5 claims?

Poisson Random Variables
A RV X is said to be a Poisson RV with parameter λ if, for some λ > 0,
P(X = i) = e^{−λ} λ^i / i!, i = 0, 1, 2, ...
E(X) = Var(X) = λ
If X ~ Bin(n, p), where n is large and p is small so that np is of moderate size, then the binomial probability can be approximated by the Poisson probability
P(X = i) ≈ e^{−λ} λ^i / i!, where λ = np.
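
A quick numerical comparison (my own sketch) of the exact binomial probability and its Poisson approximation, in the setting of Question 37 (n = 20,000, p = 1/15,000, so λ = 4/3):

    from math import comb, exp, factorial

    n, p = 20_000, 1 / 15_000
    lam = n * p  # 4/3

    def binom_pmf(i):
        return comb(n, i) * p**i * (1 - p)**(n - i)

    def poisson_pmf(i):
        return exp(-lam) * lam**i / factorial(i)

    # P(more than 2 claims) = 1 - P(0) - P(1) - P(2), exactly and by approximation.
    print(1 - sum(binom_pmf(i) for i in range(3)))    # exact binomial
    print(1 - sum(poisson_pmf(i) for i in range(3)))  # Poisson approximation, about 0.1506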

Poisson Random Variables
Question 37. An insurance company insures 20,000 homes. The probability of a particular home making a claim for $1,000,000 is 1/15,000. (a) What is the probability that the company receives more than 2 claims? [Calculate the exact probability.] (b) What probability do we calculate for (a) if we use the Poisson approximation?

Poisson Random Variables
Question 38. Suppose the number of cars that want to park on a particular street each day is a Poisson random variable with parameter 5. There are 4 parking spots on the street. (a) What is the probability that on a given day, none of the parking spots is used? (b) What is the probability that one of the cars wanting to park on that street will not be able to? (c) What is the expected number of parking spots unused on that street?

Poisson Random Variables
Question 39 (Expectation of a Function of a Random Variable). The number of customers a company has on a given day is a Poisson random variable with parameter 5. For each customer, the company receives $300, but to handle n customers, the company has to pay 100n/(n + 1). What is the company's expected profit each day?

Expectation of Sums of Random Variables
If S is either finite or countably infinite, E[X] = Σ_{s∈S} X(s) p(s)
For random variables X_1, X_2, ..., X_n, E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i]

Expectation of Sums of Random Variables
Question 40. 100 fair 6-sided dice are rolled; what is the expected value of the total of all 100 dice?

Expectation of Sums of Random Variables
Question 41. A gambler starts with $5, and continues to play a game where he either wins or loses his stake, with equal probability, until he either has $25 or has $0. He is allowed to choose how much to bet each time, but is not allowed to bet more than he has, and does not bet more than he would need to win to increase his total to $25. He decides how much to bet each time by rolling a fair (6-sided) die, and betting the minimum of the number shown, the amount he has, and the amount he needs to increase his total to $25. What is the probability that he quits with $25? [You may assume that he will certainly reach either $25 or $0 eventually. Hint: some of the information in this question is not necessary.]

Expectation of Sums of Random Variables
Question 42. You are considering an investment. You would originally invest $1,000, and every year, the investment will either increase by 60% or decrease by 60%, with equal probability. You plan to use the investment after 20 years. (a) What is the expected value of the investment after 20 years? (b) What is the probability that the investment will be worth more than $1,000 after 20 years?

Cumulative Distribution Functions
Definition and properties: The cdf of a RV X is defined as F(x) = P(X ≤ x).
F is a non-decreasing function
lim_{b→∞} F(b) = 1
lim_{b→−∞} F(b) = 0
F is right continuous.
P(a < X ≤ b) = F(b) − F(a)
P(X < b) = lim_{n→∞} F(b − 1/n)

Cumulative Distribution Functions
Question 43. What is the cumulative distribution function for the sum of two fair dice?

Continuous Random Variables
Continuous RV and pdf: X is a continuous RV if there exists a function f ≥ 0 defined on (−∞, ∞) such that, for any set B of real numbers, P(X ∈ B) = ∫_B f(x) dx.
The function f is called the probability density function (pdf). Thus any pdf has two properties: f(x) ≥ 0 for all x, and ∫_{−∞}^{∞} f(x) dx = 1.

Expectation and Variance of Continuous Random Variables
E[X] = ∫_{−∞}^{+∞} x f(x) dx
For any real-valued function g(x), E[g(X)] = ∫_{−∞}^{+∞} g(x) f(x) dx
For constants a and b, we have E[aX + b] = aE[X] + b
Var[X] = E[(X − µ)^2] = E[X^2] − (E[X])^2

Expectation and Variance of Continuous Random Variables
Question 44. A random variable X has probability density function
f_X(x) = 2 / (√3 π (1 + x^2/3)^2).
(a) What is the expectation E(X)? (b) What is the variance Var(X)? [Hint: differentiate 1/(1 + x^2/3), and use this to integrate by parts. Recall that (d/dx) arctan(x/√3) = (1/√3) / (1 + x^2/3), and that arctan(∞) = π/2 and arctan(−∞) = −π/2.]

Expectation and Variance of Continuous Random Variables
Question 45. A random variable X has a truncated exponential distribution; that is, its probability density function is given by
f_X(x) = 0 if x < 0; λe^{−λx} / (1 − e^{−λa}) if 0 ≤ x < a; 0 if a ≤ x.
(a) What is the expectation E(X)? (b) What is the variance Var(X)?

Uniform Random Variables
pdf and cdf of X, if X ~ Unif(α, β).
E[X] = (α + β)/2
Var(X) = (β − α)^2 / 12

Uniform Random Variables
Question 46. Which of the following distributions for X gives the larger probability that X > 3? (i) X is uniformly distributed on (−2, 7). (ii) X is uniformly distributed on (0, 5).

Normal Random Variables
pdf and cdf of X, if X ~ N(µ, σ^2).
If X ~ N(µ, σ^2), then Y = aX + b is normally distributed as N(aµ + b, a^2 σ^2).
E[X] = µ
Var(X) = σ^2
Three types of common problems.
Normal approximation to the Binomial distribution.
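
The standard normal cdf Φ can be evaluated through the error function in the Python standard library; a sketch (mine, not from the notes) of the usual standardization step, illustrated on the setup of Question 47:

    from math import erf, sqrt

    def phi(z):
        # Standard normal cdf via the error function.
        return 0.5 * (1 + erf(z / sqrt(2)))

    # In the spirit of Question 47: X ~ N(1, 1); P(X^2 > 1) = P(X > 1) + P(X < -1).
    mu, sigma = 1, 1
    p = (1 - phi((1 - mu) / sigma)) + phi((-1 - mu) / sigma)
    print(p)  # about 0.523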

Normal Random Variables
Question 47. Let X be normally distributed with mean 1 and standard deviation 1. What is the probability that X^2 > 1?

Normal Random Variables
Question 48. Let X be normally distributed with mean 4 and standard deviation 3. What is the shortest interval such that X has an 80% probability of lying in that interval?

Normal Random Variables
Question 49. An online banking website asks for its customers' date of birth as a security question. Assuming that the age of customers who have an online banking account is normally distributed with mean 35 years and standard deviation 10 years: (a) how many guesses would a criminal need to make in order to have a 50% chance of correctly guessing the date of birth of a particular customer? (b) If the criminal estimates that he can safely make up to 200 guesses without being caught, what is the probability of guessing correctly?

Normal Random Variables
Question 50. A car insurance company sets its premium at $450 for a year. Each customer has a probability of making a claim for $10,000. The company has to cover administrative costs of $300 per customer. How many customers does it need in order to have a probability of being unable to cover its costs? [You may use any reasonable approximations for the distribution of the number of claims.]

Normal Random Variables
Question 51. The number of visitors to a particular website at a given time is normally distributed with mean 1,300 and variance 500^2. The server that hosts the website can handle 2,500 visitors at once. (a) What is the probability that there are more visitors than it can handle? (b) The company wants to reduce the chance of having too many visitors to How many visitors does the server need to be able to handle in order for this to be achieved? (c) The company also wants to attract more visitors to the website, so it upgrades the server so that it can handle 3,500 visitors at once. Assuming that the variance remains at 500^2, what is the largest mean number of visitors at a given time such that the company can maintain its target for the probability of having too many visitors?

Exponential Random Variables
pdf and cdf of X, if X ~ Exponential(λ).
E[X] = 1/λ
Var(X) = 1/λ^2
Exponentially distributed RVs are memoryless: P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0.
Hazard rate (failure rate) function: λ(t) = f(t) / (1 − F(t)); λ(t) dt is P(X ∈ (t, t + dt) | X > t). If f(t) is exponential, then λ(t) = λ.
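
A small simulation sketch (mine, with arbitrary parameter values) illustrating the memoryless property P(X > s + t | X > t) = P(X > s):

    import random

    random.seed(1)
    lam, s, t, n = 0.5, 2.0, 3.0, 10**6
    samples = [random.expovariate(lam) for _ in range(n)]

    # Both estimates should be close to exp(-lam * s) = 0.368.
    survived_t = [x for x in samples if x > t]
    conditional = sum(x > s + t for x in survived_t) / len(survived_t)
    unconditional = sum(x > s for x in samples) / n
    print(conditional, unconditional)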

Exponential Random Variables
Question 52. A scientist, Dr. Jones, believes he has found a cure for aging. If he is right, people will no longer die of old age (but will still die of disease and accidents at the same rate). In this case, what would life expectancy be for people who do not grow old? [You may assume that people currently die from accidents and disease at a roughly constant rate before the age of about 50, and that after Dr. Jones's cure, they will continue to die at this rate forever. Currently 93% of people live to age 50.]

Distribution of a Function of a Random Variable
Question 53. If X is an exponential random variable with parameter λ, what is the distribution function of X^2?

Distribution of a Function of a Random Variable
Question 54. Calculate the probability density function of the square of a normal random variable with mean 0 and standard deviation 1.

Joint Distribution Functions
For two RVs (X, Y), the joint cdf is defined as F(a, b) = P(X ≤ a, Y ≤ b) for −∞ < a, b < ∞.
The (marginal) distribution of X is F_X(a) = P(X ≤ a, Y < ∞) = lim_{b→∞} F(a, b). Similarly F_Y(b) = F(∞, b).
P(X > a, Y > b) = 1 − F_X(a) − F_Y(b) + F(a, b)
More generally, P(a_1 < X ≤ a_2, b_1 < Y ≤ b_2) = F(a_2, b_2) − F(a_1, b_2) − F(a_2, b_1) + F(a_1, b_1)
If X and Y are both discrete RVs, the joint pmf is defined as p(x, y) = P(X = x, Y = y).

Joint Distribution Functions
If X and Y are both continuous RVs, the joint pdf satisfies P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy. Thus
F(a, b) = ∫_{−∞}^b ∫_{−∞}^a f(x, y) dx dy
f(a, b) = ∂^2 F(a, b) / ∂a ∂b
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

Joint Distribution Functions
Question 55. A fair (6-sided) die is rolled 50 times. What is the probability of getting exactly 7 sixes and 6 fives?

Joint Distribution Functions
Question 56. A fair (6-sided) die is rolled 10 times. What is the probability of getting the same number of 5's and 6's?

Independent Random Variables
Random variables X and Y are independent if P(X ∈ A, Y ∈ B) = P(X ∈ A)P(Y ∈ B) for all A and B.
This is equivalent to P(X ≤ a, Y ≤ b) = P(X ≤ a)P(Y ≤ b) for all a and b.
Also equivalent to F(a, b) = F_X(a)F_Y(b) for all a and b.
If X and Y are both discrete RVs, p(x, y) = p_X(x)p_Y(y) for all x and y.
If X and Y are both continuous RVs, f(x, y) = f_X(x)f_Y(y) for all x and y.

Independent Random Variables
Question 57. Many men and women go fishing at a popular wharf; a random fisher is male with probability 2/3 and female with probability 1/3. The total number of people who fish at the wharf on a given day is a Poisson random variable with parameter λ. What is the distribution of the number of men who fish at the wharf each day? What is the distribution of the number of women who fish at the wharf each day? Are the number of men and the number of women who fish each day independent?

Joint Probability Distribution of Functions of Random Variables
Theorem. Let X_1 and X_2 be jointly continuous RVs with joint pdf f_{X_1,X_2}. Suppose Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_1, X_2), and assume:
(1) The equations y_1 = g_1(x_1, x_2) and y_2 = g_2(x_1, x_2) can be uniquely solved for x_1 and x_2, with solutions x_1 = h_1(y_1, y_2), x_2 = h_2(y_1, y_2).
(2) For any (x_1, x_2), the functions g_1 and g_2 have continuous partial derivatives and the Jacobian
J(x_1, x_2) = det [ ∂g_1/∂x_1, ∂g_1/∂x_2 ; ∂g_2/∂x_1, ∂g_2/∂x_2 ] ≠ 0.
Then the RVs Y_1 and Y_2 are jointly continuous with joint pdf given by
f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2) |J(x_1, x_2)|^{−1}, where x_1 = h_1(y_1, y_2) and x_2 = h_2(y_1, y_2).

Joint Probability Distribution of Functions of Random Variables
Question 58. (R, Θ) have joint density function f_{R,Θ}(r, θ) = (1/2π) r e^{−r^2/2} for {(r, θ) : r ∈ (0, ∞), θ ∈ (0, 2π)}, and 0 otherwise. Let X = R cos Θ and Y = R sin Θ. (a) What is the joint density function of (X, Y)? (b) Are X and Y independent?

Sums of Independent Random Variables
If X and Y are independent continuous RVs with pdfs f_X and f_Y, then the cdf of X + Y is
F_{X+Y}(a) = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy (the convolution).
Thus f_{X+Y}(a) = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy.
This can also be obtained through the joint density function of (X + Y, Y).
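
A numeric sketch (mine, not from the notes) of the convolution formula: discretizing the integral for two independent Uniform(0, 1) variables recovers the triangular density, f(a) = a on [0, 1] and 2 − a on [1, 2].

    # Discretized convolution f_{X+Y}(a) = integral of f_X(a - y) f_Y(y) dy
    # for X, Y independent Uniform(0, 1).
    def f_unif(x):
        return 1.0 if 0 <= x <= 1 else 0.0

    def f_sum(a, steps=10_000):
        dy = 1.0 / steps
        return dy * sum(f_unif(a - (k + 0.5) * dy) * f_unif((k + 0.5) * dy) for k in range(steps))

    for a in (0.5, 1.0, 1.5):
        print(a, round(f_sum(a), 3))  # about 0.5, 1.0, 0.5: the triangular density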

Sums of Independent Random Variables
Question 59. Show that the minimum of two independent exponential random variables with parameters λ_1 and λ_2 is another exponential random variable with parameter λ_1 + λ_2.

Sums of Independent Random Variables
Question 60. If X is uniformly distributed on (0, 2), Y is uniformly distributed on (−1, 3), and X and Y are independent, what is the probability density function of X + Y?

Sums of Independent Random Variables
Question 61. The number of claims paid out by an insurance company in a year is a Poisson distribution with parameter λ. The number of claims it can afford to pay is n. It is considering merging with another insurance company which can afford to pay m claims and which will have to pay out Y claims, where Y is a Poisson distribution with parameter µ. Under what conditions on m, n, λ and µ will the considered merger reduce the company's risk of bankruptcy? [You may use the normal approximation to the Poisson for estimating the risk of bankruptcy.]

Sums of Independent Random Variables
Question 62. X is normally distributed with mean 2 and standard deviation 1. Y is independent of X and normally distributed with mean 5 and standard deviation 3. (a) What is the distribution of X + Y? (b) What is the probability that Y − X > 1?

Conditional Distributions: Discrete Case
If X and Y are discrete RVs, then for all y such that p_Y(y) > 0:
The conditional probability mass function of X given Y = y is p_{X|Y}(x|y) = p(x, y) / p_Y(y)
The conditional probability distribution function of X given Y = y is F_{X|Y}(x|y) = Σ_{a ≤ x} p_{X|Y}(a|y)
If X and Y are independent, p_{X|Y}(x|y) = P(X = x)

Conditional Distributions: Discrete Case
Question 63. (a) What is the joint distribution function of the sum of two fair dice and the (unsigned) difference [e.g., for the roll (3, 6) or (6, 3) the difference is 3]? (b) What is the conditional distribution of the difference, conditional on the sum being 7?

Conditional Distributions: Continuous Case
If X and Y are continuous RVs with joint pdf f(x, y), then for all y such that f_Y(y) > 0:
The conditional probability density function of X given Y = y is f_{X|Y}(x|y) = f(x, y) / f_Y(y)
The conditional cumulative distribution function of X given Y = y is F_{X|Y}(a|y) = ∫_{−∞}^a f_{X|Y}(x|y) dx
If X and Y are independent, f_{X|Y}(x|y) = f_X(x)

Conditional Distributions: Continuous Case
Question 64. X is a normal random variable with mean 0 and variance 1. Y is given as Y = X^3 + N(0, 2^2). What is the conditional distribution of X given Y = 2? [You do not need to calculate the normalisation constant.]

Conditional Distributions: Continuous Case
Question 65. If the point (X, Y) is uniformly distributed inside the set {(x, y) : x^2 + 2y^2 − xy < 5}, what is the conditional distribution of X given that Y = 1?

Conditional Distributions: Continuous Case
Question 66. A company is planning to run an advertising campaign. It estimates that the number of customers it gains from the advertising campaign will be approximately normally distributed with mean 2,000 and standard deviation 500. It also estimates that if it attracts n customers, the time until the advertising campaign becomes profitable is exponentially distributed with parameter n/500. (a) What is the joint density function for the number of new customers and the time before this advertising campaign becomes profitable? (b) What is the marginal distribution of the time before this campaign becomes profitable?

Conditional Distributions: Continuous Case
Question 67. Give an example of a joint distribution function for two continuous random variables X and Y such that for any x for which f_X(x) > 0, E(Y | X = x) = 0, and for any y for which f_Y(y) > 0, E(X | Y = y) = 0, but such that X and Y are not independent.

Expectation of Sums of Random Variables
If X and Y are RVs with joint pmf p(x, y), then E[g(X, Y)] = Σ_y Σ_x g(x, y) p(x, y)
If X and Y are RVs with joint pdf f(x, y), then E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy
In general, when E[X] and E[Y] are finite, E[X + Y] = E[X] + E[Y].

Expectation of Sums of Random Variables
Question 68. A government is trying to demonstrate that speed cameras prevent accidents. It conducts the following study:
(i) Pick 10 locations.
(ii) Count the number of accidents at each location in the past 3 years.
(iii) Choose the location x with the most accidents, and install a speed camera there.
(iv) Count the number of accidents at location x in the following 3 years.
(v) The number of accidents prevented is the number of accidents at location x during the 3 years before installing the speed camera, minus the number of accidents at location x in the 3 years after installing the speed camera.

Expectation of Sums of Random Variables
Question 68 (continued). Assume that the number of accidents at each site is an independent Poisson random variable with parameter 0.5, and that installing a speed camera at a given location has no effect on the number of accidents at that location. What is the expected number of accidents prevented? [You may ignore the possibility that more than 4 accidents happen at any location, to get a good approximation to the true answer. You may wish to use the expectation formula E(X) = Σ_{n=1}^{∞} P(X ≥ n) for a random variable X taking values in the natural numbers.] (based on notverygoodatstatistics.html)

Expectation of Sums of Random Variables
Question 69. In a particular area in a city, the roads follow a rectangular grid pattern. You want to get from a location A to a location B that is m blocks East and n blocks North of A. (a) How many shortest paths are there from A to B? (b) After a storm, each road segment independently has probability p of being flooded. What is the expected number of shortest paths from A to B that are still useable? (d) [Chapter 8] Use Markov's inequality to find a bound on the probability that there is still a useable shortest path from A to B.

Moments of the Number of Events that Occur
If the RV I_i is an indicator variable for the event A_i, i = 1, ..., n, and X = Σ_{i=1}^n I_i, then E[X] = Σ_{i=1}^n E[I_i] = Σ_{i=1}^n P(A_i).
For pairs of events, C(X, 2) = Σ_{i<j} I_i I_j, thus E[C(X, 2)] = Σ_{i<j} E[I_i I_j] = Σ_{i<j} P(A_i A_j),
from which we can derive the variance.

Moments of the Number of Events that Occur
Question 70. A random graph is generated on n vertices, where each edge has probability p of being in the graph. What is (a) the expected number of triangles? (b) the variance of the number of triangles? [Hint: there are two types of pairs of triangles to consider: those that share an edge, and those that don't.]

Moments of the Number of Events that Occur
Question 71. An ecologist is collecting butterflies. She collects a total of n butterflies. There are a total of N species of butterfly, and each butterfly is equally likely to be any of the species. (a) What is the expected number of species of butterflies she collects? (b) What is the variance of the number of species of butterflies she collects?

Covariance, Variance of Sums and Correlation
If X and Y are independent, then for any functions h and g, E[g(X)h(Y)] = E[g(X)]E[h(Y)]
The covariance between X and Y is defined as Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]
Properties of covariance:
Cov(X, Y) = Cov(Y, X)
Cov(X, X) = Var(X)
Cov(aX, Y) = a Cov(X, Y)
Cov(Σ_{i=1}^n X_i, Σ_{j=1}^m Y_j) = Σ_{i=1}^n Σ_{j=1}^m Cov(X_i, Y_j)
Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) + 2 Σ_{i<j} Cov(X_i, X_j)
The correlation of two random variables X and Y is defined as ρ(X, Y) = Cov(X, Y) / √(Var(X)Var(Y))
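
A short simulation sketch (mine, with a made-up dependent pair) estimating Cov(X, Y) and ρ(X, Y) from samples, as a check on the formula Cov(X, Y) = E[XY] − E[X]E[Y]:

    import random

    random.seed(2)
    n = 10**5
    # X standard normal, Y = X plus independent standard normal noise.
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [x + random.gauss(0, 1) for x in xs]

    ex, ey = sum(xs) / n, sum(ys) / n
    cov = sum(x * y for x, y in zip(xs, ys)) / n - ex * ey
    var_x = sum(x * x for x in xs) / n - ex**2
    var_y = sum(y * y for y in ys) / n - ey**2
    print(cov, cov / (var_x * var_y) ** 0.5)  # near the theoretical values 1 and 1/sqrt(2)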

Covariance, Variance of Sums and Correlation
Question 72. Show that Cov(X, Y) ≤ √(Var(X)) √(Var(Y)). [Hint: consider Var(λX − βY) for suitable λ and β.]

Conditional Expectation
Definitions:
If X and Y are jointly discrete random variables, the conditional pmf of X given Y = y is p_{X|Y}(x|y) = p(x, y)/p_Y(y); thus the conditional expectation of X given Y = y, for y such that p_Y(y) > 0, is E[X | Y = y] = Σ_x x p_{X|Y}(x|y)
If X and Y are jointly continuous random variables, the conditional pdf of X given Y = y is f_{X|Y}(x|y) = f(x, y)/f_Y(y); thus the conditional expectation of X given Y = y, for y such that f_Y(y) > 0, is E[X | Y = y] = ∫_{−∞}^{∞} x f_{X|Y}(x|y) dx
Conditional expectations satisfy the properties of ordinary expectations.

Conditional Expectation
Expectations by conditioning: E[X | Y] is the random variable whose value at Y = y is E[X | Y = y].
A very important property: E[X] = E[E[X | Y]]
If Y is a discrete random variable, this is E[X] = Σ_y E[X | Y = y] P(Y = y)
If Y is a continuous random variable with pdf f_Y(y), this is E[X] = ∫_{−∞}^{∞} E[X | Y = y] f_Y(y) dy

Conditional Expectation
Question 73. Consider the following experiment: Roll a fair (6-sided) die. If the result n is odd, toss n fair coins and count the number of heads. If n is even, roll n fair dice and take the sum of the numbers. What is the expected outcome?
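
A simulation sketch of Question 73 (mine, not the intended analytic solution) that also illustrates E[X] = E[E[X | Y]]: condition on the die roll, compute the conditional mean for each face, and average.

    import random

    # E[X] = E[E[X | N]]: N is the die roll; given N = n, the conditional mean is
    # n/2 (heads among n coins) if n is odd, and 3.5n (sum of n dice) if n is even.
    cond_mean = {n: n / 2 if n % 2 == 1 else 3.5 * n for n in range(1, 7)}
    print(sum(cond_mean.values()) / 6)  # (0.5 + 7 + 1.5 + 14 + 2.5 + 21) / 6 = 7.75

    # Monte Carlo check of the same expectation.
    random.seed(3)
    def trial():
        n = random.randint(1, 6)
        if n % 2 == 1:
            return sum(random.random() < 0.5 for _ in range(n))
        return sum(random.randint(1, 6) for _ in range(n))
    print(sum(trial() for _ in range(10**5)) / 10**5)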

Conditional Expectation
Question 74. A company has 50 employees, and handles a number of projects. The time a project takes (in days) follows an exponential random variable with parameter k/100, where k is the number of employees working on that project. The company receives m projects simultaneously, and has no other projects at that time. What is the expected time until completion of a randomly chosen project if: (a) The company divides its workers equally between the m projects, and starts all projects at once? [Ignore any requirement that the number of employees working on a given project should be an integer.] (b) The company orders the projects at random, and assigns all its workers to project 1, then when project 1 is finished, assigns all its workers to project 2, and so on?

Conditional Expectation
Question 75. You are considering an investment. The initial investment is $1,000, and every year the investment increases by 50% with probability 0.4, decreases by 10% with probability 0.3, or decreases by 40% with probability 0.3. Let X_n denote the value of the investment after n years [so X_0 = 1000]. (a) Calculate E(X_n | X_{n−1} = x). (b) Calculate E(X_{10}). (c) What is the probability that the investment is actually worth the amount you calculated in part (b), or more?

Conditional Expectation
Conditional variance: the conditional variance of X given Y is
Var(X | Y) = E[(X − E[X | Y])^2 | Y]
Var(X | Y) = E[X^2 | Y] − (E[X | Y])^2
Var(X) = E[Var(X | Y)] + Var(E[X | Y])

Conditional Expectation
Question 76. An insurance company finds that of its home insurance policies, only 0.5% result in a claim. Of the policies that result in a claim, the expected amount claimed is $40,000, and the standard deviation of the amount claimed is $100,000. What are the expected amount claimed and the variance of the amount claimed for any policy? [If a policy does not result in a claim, the amount claimed is $0. Hint: for the variance, try to work out E(X^2).]

Conditional Expectation
Question 77. An outbreak of a contagious disease starts with a single infected person. Each infected person i infects N_i other people, where each N_i is a random variable with expectation µ < 1. If the N_i are all independent and identically distributed, what is the expectation of the total number of people infected?

Conditional Expectation and Prediction
If X is observed to equal x, then g(x) is our prediction for the value of Y. One criterion for closeness, choosing g so that g(X) tends to be close to Y, is to minimize E[(Y − g(X))^2]. Since
E[(Y − g(X))^2] ≥ E[(Y − E[Y | X])^2],
the best possible predictor of Y is g(X) = E[Y | X].

Moment Generating Functions
The moment generating function M(t) of a RV X is defined by M(t) = E[e^{tX}]
M'(t) = d E[e^{tX}]/dt = E[d(e^{tX})/dt] = E[X e^{tX}]
Thus we have M'(0) = E[X], M''(0) = E[X^2], ..., M^{(n)}(0) = E[X^n].
If X and Y are independent, M_{X+Y}(t) = M_X(t) M_Y(t).
The moment generating function, if it exists and is finite in some region about t = 0, uniquely determines the distribution.

Moment Generating Functions
Question 78. If X is exponentially distributed with parameter λ and Y is uniformly distributed on the interval [a, b], what is the moment generating function of X + 2Y?

Moment Generating Functions
Question 79. If X is exponentially distributed with parameter λ_1 and Y is exponentially distributed with parameter λ_2, where λ_2 > λ_1: (a) find the moment generating function of X − Y. (b) [Chapter 8] Use the Chernoff bound with t = (λ_1 − λ_2)/2 to obtain a lower bound on the probability that X > Y.

Additional Properties of Normal Random Variables
Joint moment generating functions: the joint moment generating function of n RVs X_1, ..., X_n is defined as M(t_1, ..., t_n) = E[e^{t_1 X_1 + ... + t_n X_n}]
Thus M_{X_i}(t) = E[e^{t X_i}] = M(0, ..., 0, t, 0, ..., 0)
The joint moment generating function uniquely determines the joint distribution of X_1, ..., X_n.
The n RVs X_1, ..., X_n are independent if and only if M(t_1, ..., t_n) = M_{X_1}(t_1) ... M_{X_n}(t_n)

Additional Properties of Normal Random Variables
Multivariate normal distribution. Definition: Let Z_1, ..., Z_n be a set of n independent standard normal RVs. If
X_1 = a_{11} Z_1 + ... + a_{1n} Z_n + µ_1
X_2 = a_{21} Z_1 + ... + a_{2n} Z_n + µ_2
...
X_m = a_{m1} Z_1 + ... + a_{mn} Z_n + µ_m
then the RVs X_1, ..., X_m are said to have a multivariate normal distribution. The joint distribution of X_1, ..., X_m is completely determined by the means and covariances of X_1, ..., X_m.

Additional Properties of Normal Random Variables
Question 80. Let X, Y have a multivariate normal distribution, such that X has mean 2 and standard deviation 1; Y has mean 4 and standard deviation 2; and Cov(X, Y) = 2. What is P(Y > X)?

Markov's Inequality, Chebyshev's Inequality and the Weak Law of Large Numbers
Markov's inequality: If X is a random variable that takes only nonnegative values, then for any value a > 0, P(X ≥ a) ≤ E[X]/a
Chebyshev's inequality: If X is a random variable with finite mean µ and variance σ^2, then for any value k > 0, P(|X − µ| ≥ k) ≤ σ^2/k^2
If Var(X) = 0, then P(X = E(X)) = 1
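
A numeric sketch (mine) comparing the Markov and Chebyshev bounds with the exact tail probability for an Exponential(1) random variable, which has mean 1 and variance 1:

    from math import exp

    # X ~ Exponential(1): P(X >= a) = exp(-a) exactly.
    for a in (2, 5, 10):
        exact = exp(-a)
        markov = 1 / a                # P(X >= a) <= E[X]/a
        chebyshev = 1 / (a - 1) ** 2  # P(X >= a) <= P(|X - 1| >= a - 1) <= Var(X)/(a - 1)^2
        print(a, exact, markov, chebyshev)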

Markov's Inequality, Chebyshev's Inequality and the Weak Law of Large Numbers
The Weak Law of Large Numbers: Let X_1, X_2, ... be a sequence of independent and identically distributed random variables, each having finite mean E[X_i] = µ. Then, for any ε > 0,
P{ |(X_1 + ... + X_n)/n − µ| ≥ ε } → 0 as n → ∞.

Markov's Inequality, Chebyshev's Inequality and the Weak Law of Large Numbers
The Strong Law of Large Numbers: Let X_1, X_2, ... be a sequence of independent and identically distributed random variables, each having a finite mean E[X_i] = µ. Then, with probability 1,
(X_1 + ... + X_n)/n → µ as n → ∞.

Markov's Inequality, Chebyshev's Inequality and the Weak Law of Large Numbers
Question 81. Use Markov's inequality to obtain a bound on the probability that the product of two independent exponentially distributed random variables with parameters λ and µ is at least 1.

Question 69 [Section 7]. In a particular area in a city, the roads follow a rectangular grid pattern. You want to get from a location A to a location B that is m blocks East and n blocks North of A. (b) After a storm, each road segment independently has probability p of being flooded. What is the expected number of shortest paths from A to B that are still useable?
Answer: The expected number of useable shortest paths is C(m + n, m) (1 − p)^{m+n}.
Question 69 (continued). (d) Use Markov's inequality to find a bound on the probability that there is still a useable shortest path from A to B.

Markov's Inequality, Chebyshev's Inequality and the Weak Law of Large Numbers
Question 82. X is uniformly distributed on [3, 8]. Y is a positive random variable with expectation 3. X and Y are independent. Obtain an upper bound on the probability that XY ≥ 20: (a) by using Markov's inequality on XY; (b) by conditioning on X = x and using Markov's inequality on Y to bound P(Y ≥ 20/x), then finding the bound on the probability that XY ≥ 20. (c) [bonus] Without any assumption on the independence of X and Y, find an upper bound on P(XY ≥ 20). [Condition on the value of X, and consider the conditional expectation E(Y | X).]

Central Limit Theorem
Let X_1, X_2, ... be a sequence of independent and identically distributed random variables, each having mean µ and variance σ^2. Then the distribution of
(X_1 + ... + X_n − nµ) / (σ√n)
tends to the standard normal as n → ∞.
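
A simulation sketch (my own) of the central limit theorem: standardized sums of i.i.d. Uniform(0, 1) variables should exceed 2 about 2.3% of the time, matching the standard normal tail 1 − Φ(2).

    import random

    random.seed(4)
    n, trials = 48, 20_000
    mu, sigma = 0.5, (1 / 12) ** 0.5  # mean and standard deviation of Uniform(0, 1)

    def standardized_sum():
        s = sum(random.random() for _ in range(n))
        return (s - n * mu) / (sigma * n**0.5)

    tail = sum(standardized_sum() > 2 for _ in range(trials)) / trials
    print(tail)  # close to 1 - Phi(2), about 0.023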

Central Limit Theorem
Question 83. A car insurance company is trying to decide what premium to charge. It estimates that it can attract 20,000 customers. Each customer has a probability of making a claim for $3,000. The company has to cover $1,000,000 in costs. What premium should it set in order to have a probability of being unable to cover its costs?

Central Limit Theorem
Question 84. An insurance company insures houses. They find that the expected value of the total amount claimed by an insured household each year is $500, and the standard deviation is $3,000. The company charges an annual premium of $600 for each household. Of this premium, $60 is needed to cover the company's costs. (So the remaining $540 can be used to cover claims.) (a) If the company insures 50,000 homes, what is the probability that it makes a loss in a given year? (That is, the probability that it must pay out more in claims and costs than it collects in premiums.) (b) What important assumptions have you made in part (a)? How reasonable are these assumptions? (c) If the company wants to reduce this probability to less than , how many houses would it need to insure at the current premium?

Central Limit Theorem
Question 85. A gambler is playing roulette in a casino. The gambler continues to bet $1 on the number where the ball will land. There are 37 numbers. If the gambler guesses correctly, the casino pays him $35 (and he keeps his $1). Otherwise, he loses his $1. If he keeps playing, how long is it before the probability that he has more money than he started with is less than ?

Central Limit Theorem
Question 86. A civil servant is conducting a survey to determine whether people support a proposed policy. They plan to survey 200 people. Suppose that in fact 48% of people support the proposed policy, and that the people surveyed are chosen at random, so that each person surveyed independently supports the policy with probability 0.48. (a) What is the probability that the survey leads them to wrongly conclude that a majority of people support the policy, i.e. at least 101 people in the survey support the policy? [You may use any reasonable approximations for the distribution of the number of people in the survey that support the policy.] (b) How many people would they need to survey to have a 99% chance of reaching the correct conclusion (which happens if strictly less than half of the people in the survey support the policy)?

Other Inequalities
One-sided Chebyshev's inequality: If X is a random variable with mean 0 and finite variance σ^2, then for any a > 0,
P(X ≥ a) ≤ σ^2 / (σ^2 + a^2)
Corollary: If E[X] = µ and Var(X) = σ^2, then for a > 0,
P(X ≥ µ + a) ≤ σ^2 / (σ^2 + a^2)
P(X ≤ µ − a) ≤ σ^2 / (σ^2 + a^2)

Other Inequalities
Chernoff bounds:
P(X ≥ a) ≤ e^{−ta} M(t) for all t > 0
P(X ≤ a) ≤ e^{−ta} M(t) for all t < 0
Jensen's inequality: If f(x) is a convex function, then E[f(X)] ≥ f(E[X])

Other Inequalities
Question 79 (Section 7). If X is exponentially distributed with parameter λ_1 and Y is exponentially distributed with parameter λ_2, where λ_2 > λ_1: (a) find the moment generating function of X − Y.
Answer: The moment generating function is M_{X−Y}(t) = λ_1 λ_2 / ((λ_1 − t)(λ_2 + t)).
Question 79 (continued). (b) [Chapter 8] Use the Chernoff bound with t = (λ_1 − λ_2)/2 to obtain a lower bound on the probability that X > Y.

Question 70 [Section 7] (Other Inequalities). A random graph is generated on 10 vertices, where each edge has probability 0.2 of being in the graph. What is (a) the expected number of triangles? (b) the variance of the number of triangles? [Hint: there are two types of pairs of triangles to consider: those that share an edge, and those that don't.]
Answer: (a) 0.96 (b)
Question 70 (continued). (c) Use the one-sided Chebyshev inequality to find a bound on the probability that there are no triangles in the graph.


More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Notes for Math 324, Part 20

Notes for Math 324, Part 20 7 Notes for Math 34, Part Chapter Conditional epectations, variances, etc.. Conditional probability Given two events, the conditional probability of A given B is defined by P[A B] = P[A B]. P[B] P[A B]

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

What is Probability? Probability. Sample Spaces and Events. Simple Event

What is Probability? Probability. Sample Spaces and Events. Simple Event What is Probability? Probability Peter Lo Probability is the numerical measure of likelihood that the event will occur. Simple Event Joint Event Compound Event Lies between 0 & 1 Sum of events is 1 1.5

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

Lecture 1: Review on Probability and Statistics

Lecture 1: Review on Probability and Statistics STAT 516: Stochastic Modeling of Scientific Data Autumn 2018 Instructor: Yen-Chi Chen Lecture 1: Review on Probability and Statistics These notes are partially based on those of Mathias Drton. 1.1 Motivating

More information

PRACTICE PROBLEMS FOR EXAM 2

PRACTICE PROBLEMS FOR EXAM 2 PRACTICE PROBLEMS FOR EXAM 2 Math 3160Q Fall 2015 Professor Hohn Below is a list of practice questions for Exam 2. Any quiz, homework, or example problem has a chance of being on the exam. For more practice,

More information

Discrete random variables and probability distributions

Discrete random variables and probability distributions Discrete random variables and probability distributions random variable is a mapping from the sample space to real numbers. notation: X, Y, Z,... Example: Ask a student whether she/he works part time or

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

STAT 430/510 Probability Lecture 7: Random Variable and Expectation

STAT 430/510 Probability Lecture 7: Random Variable and Expectation STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

STOR Lecture 16. Properties of Expectation - I

STOR Lecture 16. Properties of Expectation - I STOR 435.001 Lecture 16 Properties of Expectation - I Jan Hannig UNC Chapel Hill 1 / 22 Motivation Recall we found joint distributions to be pretty complicated objects. Need various tools from combinatorics

More information

CSE 312 Final Review: Section AA

CSE 312 Final Review: Section AA CSE 312 TAs December 8, 2011 General Information General Information Comprehensive Midterm General Information Comprehensive Midterm Heavily weighted toward material after the midterm Pre-Midterm Material

More information

Probabilistic models

Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became the definitive formulation

More information

Lecture 1. ABC of Probability

Lecture 1. ABC of Probability Math 408 - Mathematical Statistics Lecture 1. ABC of Probability January 16, 2013 Konstantin Zuev (USC) Math 408, Lecture 1 January 16, 2013 1 / 9 Agenda Sample Spaces Realizations, Events Axioms of Probability

More information

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay 1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

lecture notes October 22, Probability Theory

lecture notes October 22, Probability Theory 8.30 lecture notes October 22, 203 Probability Theory Lecturer: Michel Goemans These notes cover the basic definitions of discrete probability theory, and then present some results including Bayes rule,

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

4.1 Expected value of a discrete random variable. Suppose we shall win $i if the die lands on i. What is our ideal average winnings per toss?

4.1 Expected value of a discrete random variable. Suppose we shall win $i if the die lands on i. What is our ideal average winnings per toss? Chapter 4 Mathematical Expectation 4.1 Expected value of a discrete random variable Examples: (a) Given 1, 2, 3, 4, 5, 6. What is the average? (b) Toss an unbalanced die 1 times. Lands on 1 2 3 4 5 6 Probab..1.3.1.2.2.1

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015. EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y. CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook

More information

CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions)

CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions) CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions) 1. (Confidence Intervals, CLT) Let X 1,..., X n be iid with unknown mean θ and known variance σ 2. Assume

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

M378K In-Class Assignment #1

M378K In-Class Assignment #1 The following problems are a review of M6K. M7K In-Class Assignment # Problem.. Complete the definition of mutual exclusivity of events below: Events A, B Ω are said to be mutually exclusive if A B =.

More information

Discrete Probability Refresher

Discrete Probability Refresher ECE 1502 Information Theory Discrete Probability Refresher F. R. Kschischang Dept. of Electrical and Computer Engineering University of Toronto January 13, 1999 revised January 11, 2006 Probability theory

More information

3. Review of Probability and Statistics

3. Review of Probability and Statistics 3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

18.440: Lecture 26 Conditional expectation

18.440: Lecture 26 Conditional expectation 18.440: Lecture 26 Conditional expectation Scott Sheffield MIT 1 Outline Conditional probability distributions Conditional expectation Interpretation and examples 2 Outline Conditional probability distributions

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB

More information

Bivariate Distributions

Bivariate Distributions STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 17 Néhémy Lim Bivariate Distributions 1 Distributions of Two Random Variables Definition 1.1. Let X and Y be two rrvs on probability space (Ω, A, P).

More information

Examples of random experiment (a) Random experiment BASIC EXPERIMENT

Examples of random experiment (a) Random experiment BASIC EXPERIMENT Random experiment A random experiment is a process leading to an uncertain outcome, before the experiment is run We usually assume that the experiment can be repeated indefinitely under essentially the

More information

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the

More information

Exam 1 - Math Solutions

Exam 1 - Math Solutions Exam 1 - Math 3200 - Solutions Spring 2013 1. Without actually expanding, find the coefficient of x y 2 z 3 in the expansion of (2x y z) 6. (A) 120 (B) 60 (C) 30 (D) 20 (E) 10 (F) 10 (G) 20 (H) 30 (I)

More information

Two characteristic properties of a random experiment:

Two characteristic properties of a random experiment: 1 Lecture 1 1.1 Probability as Relative Frequency A single coin flipping experiment. Observe a coin flipped many times. What probability we can prescribe to the event that a head turns up? We expect the

More information

Elementary Discrete Probability

Elementary Discrete Probability Elementary Discrete Probability MATH 472 Financial Mathematics J Robert Buchanan 2018 Objectives In this lesson we will learn: the terminology of elementary probability, elementary rules of probability,

More information

Random Variables. Lecture 6: E(X ), Var(X ), & Cov(X, Y ) Random Variables - Vocabulary. Random Variables, cont.

Random Variables. Lecture 6: E(X ), Var(X ), & Cov(X, Y ) Random Variables - Vocabulary. Random Variables, cont. Lecture 6: E(X ), Var(X ), & Cov(X, Y ) Sta230/Mth230 Colin Rundel February 5, 2014 We have been using them for a while now in a variety of forms but it is good to explicitly define what we mean Random

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 10: Expectation and Variance Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/ psarkar/teaching

More information

CSC Discrete Math I, Spring Discrete Probability

CSC Discrete Math I, Spring Discrete Probability CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields

More information

Chapter 2 Random Variables

Chapter 2 Random Variables Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung

More information

3 Conditional Expectation

3 Conditional Expectation 3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya

Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya BBM 205 Discrete Mathematics Hacettepe University http://web.cs.hacettepe.edu.tr/ bbm205 Lecture 10: Bayes' Theorem, Expected Value and Variance Lecturer: Lale Özkahya Resources: Kenneth Rosen, Discrete

More information

STAT 430/510: Lecture 10

STAT 430/510: Lecture 10 STAT 430/510: Lecture 10 James Piette June 9, 2010 Updates HW2 is due today! Pick up your HW1 s up in stat dept. There is a box located right when you enter that is labeled "Stat 430 HW1". It ll be out

More information

STT 441 Final Exam Fall 2013

STT 441 Final Exam Fall 2013 STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are

More information

Probability 1 (MATH 11300) lecture slides

Probability 1 (MATH 11300) lecture slides Probability 1 (MATH 11300) lecture slides Márton Balázs School of Mathematics University of Bristol Autumn, 2015 December 16, 2015 To know... http://www.maths.bris.ac.uk/ mb13434/prob1/ m.balazs@bristol.ac.uk

More information

Probability COMP 245 STATISTICS. Dr N A Heard. 1 Sample Spaces and Events Sample Spaces Events Combinations of Events...

Probability COMP 245 STATISTICS. Dr N A Heard. 1 Sample Spaces and Events Sample Spaces Events Combinations of Events... Probability COMP 245 STATISTICS Dr N A Heard Contents Sample Spaces and Events. Sample Spaces........................................2 Events........................................... 2.3 Combinations

More information

MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics

MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra

More information