CS1800 Probability 3: Bayes Rule. Professor Kevin Gold


1 CS1800 Probability 3: Bayes Rule Professor Kevin Gold

2 Brief Review of Probability - Counting Outcomes and Axioms Recall from before that we can calculate probabilities by counting success outcomes and dividing by the total number of outcomes. Chance that a deck of 5 cards numbered 1-5 ends up in order after shuffling: 1/5! = 1/120. Chance that it's in order or reverse order: 2/120 = 1/60. Much of the discussion today will just take some probabilities as a given, like you think there's a 1% chance your friend is lying. This is fine too, even though we're not counting anything, as long as we obey the axioms of probability: each probability is a number in the range [0,1], generally representing our degree of belief; the union of all outcomes has probability 1; and Pr(A v B) = Pr(A) + Pr(B) - Pr(A ^ B).
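The counting calculations above can be checked by brute force. A quick Python sketch enumerating all 5! orderings of the deck (variable names are just illustrative):

```python
from itertools import permutations
from math import factorial

# All 5! orderings of a 5-card deck, each equally likely after a shuffle
orderings = list(permutations([1, 2, 3, 4, 5]))
total = len(orderings)  # 5! = 120

# Count the "success" orderings for each question
in_order = sum(1 for p in orderings if p == (1, 2, 3, 4, 5))
in_or_reverse = sum(1 for p in orderings
                    if p in ((1, 2, 3, 4, 5), (5, 4, 3, 2, 1)))

assert total == factorial(5)
print(in_order / total)       # 1/120
print(in_or_reverse / total)  # 2/120 = 1/60
```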

3 Brief Review of Probability - Independence Events are independent if learning the outcome of one does not affect our belief about the other: two rolls of a die, or events happening far from each other. The definition of independence also tells us something useful we can do: Pr(A ^ B) = Pr(A)Pr(B) for all outcomes of A and B iff A and B are independent. Probability that two rolls of a 6-sided die are both 6: (1/6)*(1/6) = 1/36, the same result as counting 1 success (6,6) out of 36 roll pairs. Pr(6-sided die is even) = 1/2 and Pr(6-sided die is odd) = 1/2, but Pr(a particular 6-sided die roll is both even and odd) = 0, so those events are not independent (if they're about the same roll).
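A short sketch checking the multiplication rule for independent events against direct enumeration of the 36 roll pairs:

```python
from itertools import product

# Enumerate all 36 equally likely (roll1, roll2) pairs of a fair 6-sided die
pairs = list(product(range(1, 7), repeat=2))
p_both_six = sum(1 for a, b in pairs if a == b == 6) / len(pairs)

# Multiplication rule for independent events: Pr(A ^ B) = Pr(A)Pr(B)
assert abs(p_both_six - (1/6) * (1/6)) < 1e-12
print(p_both_six)  # 1/36
```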

4 Brief Review of Probability - Conditional Probability Pr(A | B) is the probability that event A happens given that we know B is true: Pr(rains today | dark clouds in the sky) > Pr(rains today). The definition is Pr(A | B) = Pr(A ^ B)/Pr(B); we're shrinking the sample space to include just outcomes where B is true. Pr(exactly 2 out of 3 coin flips heads | at least one head flipped) = (C(3,2)/2^3)/(1 - (1/2)^3) = (3/8)/(7/8) = 3/7. Which we could also get by observing the actual at-least-one-head outcomes: TTH, THT, HTT, HHT, HTH, THH, HHH. From the definition, it follows that Pr(A ^ B) = Pr(A | B)Pr(B), which is a generalization of the multiplication rule used for independent events: Pr(rains today ^ dark clouds) = Pr(dark clouds)Pr(rains today | dark clouds). We can calculate the probability of both events happening even though they aren't independent, as long as we know the conditional probability.
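The 3/7 answer can also be checked by enumerating the 8 flip sequences directly, shrinking the sample space to the at-least-one-head outcomes:

```python
from itertools import product

flips = list(product("HT", repeat=3))  # 8 equally likely outcomes

# Condition on B = "at least one head": restrict the sample space to B
at_least_one_h = [f for f in flips if "H" in f]          # 7 outcomes
exactly_two_h = [f for f in at_least_one_h if f.count("H") == 2]  # 3 outcomes

# Pr(exactly 2 heads | at least 1 head)
p = len(exactly_two_h) / len(at_least_one_h)
print(p)  # 3/7
```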

5 The Uses of Bayes Rule Bayes Rule (or Bayes Theorem) is a powerful mathematical tool that allows you to decide which of several explanations for some data is best. It explains how to combine: your prior degree of belief in each hypothesis (probabilities), some evidence, and a model of how likely it is that each hypothesis would produce the available evidence (conditional probabilities), and arrive at a posterior likelihood of each hypothesis in light of the data.

6 Specific Applications of Bayes Rule Medical Diagnosis - Evidence: symptoms; Hypotheses: underlying diseases. Speech Recognition - Evidence: sound signal; Hypotheses: words.

7 Specific Applications of Bayes Rule Robot Navigation - Evidence: local images and depth camera results; Hypotheses: locations. Political Polling - Evidence: multiple polls; Hypotheses: true underlying likely-voter counts.

8 Bayes Rule in Lay Terms The likelihood of the hypothesis in light of the evidence is proportional to two factors: our prior degree of belief in the hypothesis, before we obtained the evidence, and the likelihood that we would see this evidence if the hypothesis were true. For each competing hypothesis, we can multiply these two factors and compare the results to determine which hypothesis is the most likely explanation.

9 Bayes Rule as a Formula The likelihood of the hypothesis in light of the evidence is proportional to two factors: our prior degree of belief in the hypothesis, before we obtained the evidence, and the likelihood that we would see this evidence if the hypothesis were true. For each competing hypothesis, we can multiply these two factors and compare the results to determine which hypothesis is the most likely explanation. Pr(hypothesis | evidence) is proportional to Pr(hypothesis)*Pr(evidence | hypothesis), where Pr(hypothesis) is the prior and Pr(evidence | hypothesis) is the likelihood of the evidence.

10 Bayes Rule as an Equality The "proportional to" symbol means that the left side is equal to the right, multiplied by some constant. This constant is the same for all hypotheses, which lets us compare the results of the prior*likelihood calculation across different hypotheses. Pr(hypothesis | evidence) = α Pr(hypothesis)*Pr(evidence | hypothesis)

11 Bayes Rule as an Equality We can be more precise about the equation and remove the "proportional to": we know what the scaling factor is (it's 1/Pr(evidence)). But for many practical purposes, we don't know this factor at first, and we may not need to calculate it exactly. Pr(hypothesis | evidence) = Pr(hypothesis)*Pr(evidence | hypothesis) / Pr(evidence)

12 Bayes Rule as an Abstraction Lastly, we don't particularly need to talk about hypotheses and evidence: Bayes Rule is true generally. However, replacing everything with A's and B's makes it much less clear what we're usually trying to do. Pr(A | B) = Pr(A)*Pr(B | A) / Pr(B)

13 A Quick Proof of the Abstract Version of Bayes Rule Bayes rule follows naturally from the definition of conditional probability. Pr(A ^ B) = Pr(A | B)Pr(B) = Pr(B | A)Pr(A), by the definition of conditional probability. Drop the Pr(A ^ B) and divide both sides by Pr(B) to get Pr(A | B) = Pr(B | A)Pr(A) / Pr(B), which is Bayes rule.

14 Rewind a Bit I think the best balance of being memorable and accurate is the second way: the posterior Pr(hypothesis | evidence) is proportional to Pr(hypothesis)*Pr(evidence | hypothesis), the prior times the likelihood of the evidence (the first probability "flipped").

15 Using Bayes Rule: A Basic Example I just rolled a die that was drawn at random from a bag containing an equal number of 6-sided dice (numbered 1-6) and 20-sided dice (numbered 1-20). I tell you the die roll is a 6. Which hypothesis is more likely: that it's 6-sided, or 20-sided? Pr("6" | 6-sided die) = 1/6. Pr("6" | 20-sided die) = 1/20. Pr(6-sided die | "6") ∝ Pr("6" | 6-sided)Pr(6-sided) = 1/6 * 1/2 = 1/12. Pr(20-sided | "6") ∝ Pr("6" | 20-sided)Pr(20-sided) = 1/20 * 1/2 = 1/40. The die is more likely to be 6-sided since 1/12 > 1/40.
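The comparison is easy to mechanize. A minimal Python sketch of the prior-times-likelihood calculation (the dictionary layout is just one way to organize it):

```python
# Unnormalized posterior score: prior * likelihood, for each hypothesis
hypotheses = {
    "6-sided":  {"prior": 1/2, "likelihood_of_6": 1/6},
    "20-sided": {"prior": 1/2, "likelihood_of_6": 1/20},
}
scores = {h: v["prior"] * v["likelihood_of_6"] for h, v in hypotheses.items()}

# The hypothesis with the largest prior*likelihood product wins
best = max(scores, key=scores.get)
print(scores)  # 6-sided: 1/12, 20-sided: 1/40
print(best)    # 6-sided
```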

16 Changing the Prior Suppose that the bag contained 4 20-sided dice and just 2 6-sided dice before I drew a die and rolled a 6. Which die is more likely now? Pr("6" | 6-sided die) = 1/6. Pr("6" | 20-sided die) = 1/20. Pr(6-sided die | "6") ∝ Pr("6" | 6-sided)Pr(6-sided) = 1/6 * 1/3 = 1/18. Pr(20-sided | "6") ∝ Pr("6" | 20-sided)Pr(20-sided) = 1/20 * 2/3 = 1/30. Despite the larger number of 20-sided dice, the die is still more likely to be 6-sided due to how much more unlikely a 6 is on a 20-sided die.

17 Changing the Prior, II Suppose that the bag contained 9 20-sided dice and just 1 6-sided die before I drew a die and rolled a 6. Which die is more likely now? Pr("6" | 6-sided die) = 1/6. Pr("6" | 20-sided die) = 1/20. Pr(6-sided die | "6") ∝ Pr("6" | 6-sided)Pr(6-sided) = 1/6 * 1/10 = 1/60 ≈ 0.017. Pr(20-sided | "6") ∝ Pr("6" | 20-sided)Pr(20-sided) = 1/20 * 9/10 = 9/200 = 0.045. With enough 20-sided dice to begin with, it finally becomes more likely that the die was actually 20-sided, despite the evidence.
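The three versions of this example vary only the prior. A short sweep over bag compositions (assuming, as a generalization of the slides, a bag of 10 dice total) shows where the winning hypothesis flips:

```python
# For each number of 20-sided dice in a 10-die bag, compare the
# unnormalized posteriors after observing a roll of 6
winners = {}
for n20 in range(1, 10):
    n6 = 10 - n20
    score_6sided = (1/6) * (n6 / 10)    # likelihood * prior
    score_20sided = (1/20) * (n20 / 10)
    winners[n20] = "6-sided" if score_6sided > score_20sided else "20-sided"
print(winners)
```

The flip happens at 8 of 10 dice being 20-sided: the prior ratio must exceed the likelihood ratio (1/6)/(1/20) = 10/3 before the 20-sided hypothesis wins.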

18 Finding Exact Probabilities The numbers we derived tell us which hypothesis is most likely, but they are not the true probabilities. We need to divide each quantity by Pr(evidence) to get the true probabilities. Pr(evidence) = Pr(evidence ^ hypothesis_1) + Pr(evidence ^ hypothesis_2) + ... + Pr(evidence ^ hypothesis_n) = Pr(evidence | hypothesis_1)Pr(hypothesis_1) + Pr(evidence | hypothesis_2)Pr(hypothesis_2) + ... + Pr(evidence | hypothesis_n)Pr(hypothesis_n). (This assumes we've exhausted the possible explanations for the evidence.) These are the numbers we calculated before! In other words, we need to divide each of our results by the sum of the results to get the true probabilities.
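The divide-by-the-sum step can be packaged as a small normalization helper (the function name `posterior` is just illustrative):

```python
def posterior(priors, likelihoods):
    """Normalize prior*likelihood scores so the posteriors sum to 1."""
    scores = {h: priors[h] * likelihoods[h] for h in priors}
    # Pr(evidence), by the law of total probability over all hypotheses
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

post = posterior({"6-sided": 1/2, "20-sided": 1/2},
                 {"6-sided": 1/6, "20-sided": 1/20})
print(post)  # 6-sided: 10/13, 20-sided: 3/13
```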

19 Finding Exact Probabilities - Example I just rolled a die that was drawn at random from a bag containing an equal number of 6-sided dice (numbered 1-6) and 20-sided dice (numbered 1-20). I tell you the die roll is a 6. What is the probability of each hypothesis in light of the evidence (6-sided vs 20-sided)? Pr(6-sided die | "6") ∝ Pr("6" | 6-sided)Pr(6-sided) = 1/6 * 1/2 = 1/12. Pr(20-sided | "6") ∝ Pr("6" | 20-sided)Pr(20-sided) = 1/20 * 1/2 = 1/40. Likelihood of the evidence (overall chance of a "6") = Pr("6" | 6-sided)Pr(6-sided) + Pr("6" | 20-sided)Pr(20-sided) = 1/12 + 1/40. Pr(6-sided die | "6") = (1/12)/(1/12 + 1/40) ≈ 0.77. Pr(20-sided die | "6") = (1/40)/(1/12 + 1/40) ≈ 0.23. Notice how dividing by the sum forces the actual probabilities to sum to 1.

20 A Science-Themed Example Bayes rule can be used in scientific applications to balance prior knowledge with new observations. Scientists estimate that an asteroid is 60% likely to be composition A, and 40% likely to be composition B. Then they get some readings that would have had a 20% probability of being generated under composition A, and a 50% probability of being generated under composition B. Which hypothesis is more likely now? Pr(A | readings) ∝ Pr(readings | A) Pr(A) = 0.2 * 0.6 = 0.12. Pr(B | readings) ∝ Pr(readings | B) Pr(B) = 0.5 * 0.4 = 0.2. So hypothesis B is more likely; the new readings shifted our uncertain opinion. Specifically, we should think Pr(B | readings) = 0.2/(0.12 + 0.2) = 20/32 = 0.625 (still pretty uncertain).

21 This Really Happened the Other Day I was playing a board game with friends (Gloomhaven) when a friend drew a curse card. "I'm not sure whether I shuffled since adding this to the [top of the] deck," my friend said. There were 20 other cards in the deck (all not curse cards), and I trust my friend to be honest about his uncertainty. Pr(no shuffle | curse) ∝ Pr(curse | no shuffle)Pr(no shuffle) = 1*0.5 = 0.5. Pr(shuffle | curse) ∝ Pr(curse | shuffle)Pr(shuffle) = (1/21)*0.5 ≈ 0.024. "I'm pretty sure you didn't shuffle, then," I said. If I'd actually done the math, I could have had 0.5/(0.5 + 0.024) ≈ 95% confidence.
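Plugging the slide's numbers into code (a sketch; variable names are illustrative):

```python
# Unnormalized posteriors for "didn't shuffle" vs "shuffled",
# given that the curse card (1 of 21 cards) was on top
p_no_shuffle = 1.0 * 0.5        # curse is certain to be on top if no shuffle
p_shuffle = (1 / 21) * 0.5      # 1-in-21 chance it ends up on top after a shuffle

confidence = p_no_shuffle / (p_no_shuffle + p_shuffle)
print(round(confidence, 3))     # about 0.95
```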

22 A Practice Problem A friend flips a coin to determine who will pay for lunch, and it comes up heads: you pay. The first time this happens, it seems normal. But the coin comes up heads 8 times in a row. Before, you would have said there was only a 1% chance that your friend would use a two-headed coin. But what should the probability be in light of all these heads results? Bayes Rule reminder: Pr(hypothesis | evidence) ∝ Pr(evidence | hypothesis)Pr(hypothesis)

23 A Practice Problem A friend flips a coin to determine who will pay for lunch, and it comes up heads: you pay. The first time this happens, it seems normal. But the coin comes up heads 8 times in a row. Before, you would have said there was only a 1% chance that your friend would use a two-headed coin. But what's the more likely hypothesis now? What are the probabilities? Pr(two-headed | results) ∝ Pr(results | two-headed)Pr(two-headed) = 1*0.01 = 0.01. Pr(not two-headed | results) ∝ Pr(results | not two-headed)Pr(not two-headed) = (1/256)*0.99 ≈ 0.0039. It's now more likely that the coin is two-headed. Pr(two-headed | results) = 0.01/(0.01 + 0.0039) ≈ 0.72.
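The same calculation in code:

```python
# 8 heads in a row: compare the two-headed and fair-coin hypotheses
p_two_headed = 1.0 * 0.01        # the evidence is certain if the coin is two-headed
p_fair = (1 / 2) ** 8 * 0.99     # 1/256 chance of 8 heads with a fair coin

posterior_two_headed = p_two_headed / (p_two_headed + p_fair)
print(round(posterior_two_headed, 2))  # about 0.72
```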

24 Bayes Rule Tells Us It's Hard to Detect Rare Events Confidently Bayes rule tells us to pay attention to not just the immediate evidence, but base rates of events (our prior belief before receiving evidence). Our methods for detecting rare events (rare diseases, terrorists, etc.) all have false positive rates, where false alarms happen. Combined with Bayes rule, our prior belief that the event is rare should make it extremely difficult for an even slightly unreliable technology to convince us otherwise. Bayes rule tells us that most detections of rare events will be false positives, unless the method being used is extremely accurate. (Diagram: a tiny "bad stuff" region within the general population, overlapped by a much larger "false positive" region.)

25 Security Example Suppose we have an airport face-detection based security measure for catching terrorists that has a 99% chance of going "boop" if a terrorist passes through, but has a 2% chance of going "boop" on a regular person. Suppose that 1 in 10,000 people is a terrorist. Our detector is going "boop" - what is the chance that it's a false alarm? Pr(terrorist | boop) ∝ Pr(boop | terrorist) Pr(terrorist) = 0.99 * 0.0001 = 0.000099. Pr(not terrorist | boop) ∝ Pr(boop | not terrorist) Pr(not terrorist) = 0.02 * 0.9999 = basically 0.02. 0.02/(0.02 + 0.000099) ≈ 0.995, or a 99.5% chance the "boop" is a false alarm.
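The base-rate effect in code:

```python
# Rare-event detection: even a 99%-sensitive, 2%-false-positive detector
# is usually wrong when the base rate is 1 in 10,000
p_terrorist = 1 / 10_000
p_boop_given_terrorist = 0.99
p_boop_given_normal = 0.02

score_terrorist = p_boop_given_terrorist * p_terrorist
score_normal = p_boop_given_normal * (1 - p_terrorist)

p_false_alarm = score_normal / (score_normal + score_terrorist)
print(round(p_false_alarm, 3))  # about 0.995
```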

26 A Real Example of the Difficulty of Reliable Detection From Nate Silver's The Signal and the Noise (p. 245): "Studies show that if a woman does not have cancer, a mammogram will incorrectly claim that she does only about 10 percent of the time. If she does have cancer, on the other hand, they will detect it about 75 percent of the time. When you see those statistics, a positive mammogram seems like very bad news indeed. But if you apply Bayes's theorem to these numbers, you'll come to a different conclusion: the chance that a woman in her forties has breast cancer given that she's had a positive mammogram is still only about 10 percent. For this reason, many doctors recommend that women do not begin getting regular mammograms until they are in their fifties and the prior probability of having breast cancer is higher."

27 Updating Beliefs With More Evidence We can sometimes get around the problem of strong priors by incorporating multiple pieces of evidence. As long as the pieces of evidence are conditionally independent (independent besides their shared dependence on the hypothesis), we can treat the probabilities derived after one Bayesian calculation as our new prior going forward. This approach allows scientists to incorporate multiple experimental results, robots to combine observations across different sensors, and law enforcement to use multiple lines of evidence.

28 Updating Beliefs Example Besides the face recognition software, suppose we also employ voice recognition software that has a 1% false positive rate and a 5% false negative rate - and it goes "bing" (positive) on our suspect from before. From the face recognition results we have new priors of Pr(normal) = 0.995, Pr(terrorist) = 0.005. Pr(terrorist | bing) ∝ Pr(bing | terrorist)Pr(terrorist) = 0.95*0.005 = 0.00475. Pr(not terrorist | bing) ∝ Pr(bing | not terrorist)Pr(not terrorist) = 0.01*0.995 = 0.00995. Pr(terrorist | bing) = 0.00475/(0.00475 + 0.00995) ≈ 0.32, or 32%, which maybe at least makes this worth investigating further.
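Treating each detector reading as a separate Bayesian update, starting from the 1-in-10,000 base rate (a sketch; the `update` helper is a hypothetical name):

```python
def update(prior_positive, p_evidence_given_positive, p_evidence_given_negative):
    """One Bayesian update of a two-hypothesis belief after new evidence."""
    score_pos = p_evidence_given_positive * prior_positive
    score_neg = p_evidence_given_negative * (1 - prior_positive)
    return score_pos / (score_pos + score_neg)

p = 1 / 10_000              # base rate of terrorists
p = update(p, 0.99, 0.02)   # face recognition goes "boop"
p = update(p, 0.95, 0.01)   # voice recognition goes "bing"
print(round(p, 2))          # about 0.32
```

The posterior after the first update becomes the prior for the second, which is valid as long as the two detectors are conditionally independent given the hypothesis.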

29 Nate Silver and Being Bayesian Nate Silver, founder of fivethirtyeight.com, rose to fame in 2008 for successfully predicting 49 of 50 states' outcomes in that presidential election. He's argued that his primary innovation was simply being Bayesian about the polls: instead of choosing one poll to believe, he would constantly update his belief with the evidence of each new poll. His book The Signal and the Noise is all about how different scientific fields seem to be progressing or not, depending on how thoroughly they've adopted Bayesian methods.

30 A Quote from The Signal and the Noise (p. 452) "Bayes's theorem encourages us to be disciplined about how we weigh new information. Most of the time, we do not appreciate how noisy the data is, and so our bias is to place too much weight on the newest data point. But we can have the opposite bias when we become too personally or professionally invested in a problem, failing to change our minds when the facts do. The more often you are willing to test your ideas, the sooner you can begin to avoid these problems and learn from your mistakes."

31 Application to the Monty Hall Problem Speaking of being too attached to priors: here's a classic probability problem that seems slightly paradoxical, named after the host of the show Let's Make a Deal. A game show host offers you the choice of three doors. Behind one is a new car you'd like to win. Behind the other two doors are goats. The host lets you pick a door, which you basically do at random. Then the host says, "I'm going to choose a door that you didn't pick, and show you what's behind that door." He opens a door and reveals a goat. (You're certain he would not reveal the car at this stage, and that he must reveal a door you did not pick.) Knowing what you know now, would you like to switch doors?

32 Bayes and the Monty Hall Problem For the sake of clarity, let's assume (without loss of generality) we picked door 2 and the host opened door 3. There are now only two valid hypotheses: car behind 1, car behind 2. Our evidence is that door 3 was opened. What was the likelihood door 3 was opened if 2 had the car, and we are right? What was the likelihood door 3 was opened if 1 had the car, and we were wrong?

33 Bayes and the Monty Hall Problem (Diagram from the slide: in the "Door 1 has a car" world, the host can't reveal door 1 and can only reveal door 3; in the "Door 2 has a car" world, the host can reveal either door 1 or door 3.) What was the likelihood door 3 was opened if 2 had the car, and we are right? 1/2, since the host could open either door 1 or door 3 with equal likelihood. What was the likelihood door 3 was opened if 1 had the car, and we were wrong? 1, since the host is forced to open the non-pick, non-car door.

34 Bayes and the Monty Hall Problem Pr(door 1 has car | door 3 revealed) ∝ Pr(door 3 revealed | door 1 car)Pr(door 1 car) = 1 * 1/3 = 1/3. Pr(door 2 has car | door 3 revealed) ∝ Pr(door 3 revealed | door 2 car)Pr(door 2 car) = 1/2 * 1/3 = 1/6. So in fact, door 1 is the more likely hypothesis (with probability (1/3)/(1/3 + 1/6) = 2/3), and we should switch.

35 Monty Hall Without Bayes Rule Bayes rule is just a convenient shortcut when reasoning about conditional probabilities. We could see the same results by mapping out all the possibilities:

car behind door 1 - pick 1: switch loses; pick 2: switch wins; pick 3: switch wins
car behind door 2 - pick 1: switch wins; pick 2: switch loses; pick 3: switch wins
car behind door 3 - pick 1: switch wins; pick 2: switch wins; pick 3: switch loses

Switching wins in 6 of the 9 equally likely cases, for a probability of 2/3.
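The 2/3 answer also matches a Monte Carlo simulation of the always-switch strategy (a sketch; function and variable names are illustrative):

```python
import random

def monty_trial(rng):
    """One round: random car placement, random pick, host opens a goat door."""
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # Host opens a door that is neither the pick nor the car
    opened = rng.choice([d for d in range(3) if d != pick and d != car])
    # Switch to the one remaining closed door
    switched_to = next(d for d in range(3) if d != pick and d != opened)
    return switched_to == car

rng = random.Random(0)  # fixed seed for reproducibility
trials = 100_000
frac = sum(monty_trial(rng) for _ in range(trials)) / trials
print(frac)  # close to 2/3
```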

36 Bayes is Old News But Also Currently Hot Bayes' original work appeared in 1763, two years after his death. Laplace developed the theory further (1774). But when statistics took off, its inventors emphasized the importance of a single experiment, and didn't like the idea of incorporating priors (especially Fisher). Non-Bayesian "frequentist" statistics is still the most common kind taught in undergraduate curricula, though Bayesian statistics is gaining traction. Artificial intelligence got a major boost in the '80s thanks to the efforts of Judea Pearl, who showed how to use Bayes rule to incorporate evidence over time and across sources of evidence. Sebastian Thrun demonstrated how useful Bayesian reasoning could be to self-driving cars in the 2005 DARPA Grand Challenge; Bayesian reasoning drives Google's self-driving cars today.

37 Summary Bayes rule tells us that the likelihood of a hypothesis in light of the evidence is proportional to the product of: our prior probability of the hypothesis, and the likelihood of the evidence if the hypothesis is true. In other words, Pr(hypothesis | evidence) ∝ Pr(evidence | hypothesis)Pr(hypothesis). Given a complete set of hypotheses, we can calculate their probabilities by dividing these results by their sum, thus scaling them to sum to 1. Bayes rule is a useful way to take into account everything we know and believe, including prior beliefs, base rates of uncommon events, and experiment results.


More information

Formal Modeling in Cognitive Science Lecture 19: Application of Bayes Theorem; Discrete Random Variables; Distributions. Background.

Formal Modeling in Cognitive Science Lecture 19: Application of Bayes Theorem; Discrete Random Variables; Distributions. Background. Formal Modeling in Cognitive Science Lecture 9: ; Discrete Random Variables; Steve Renals (notes by Frank Keller) School of Informatics University of Edinburgh s.renals@ed.ac.uk February 7 Probability

More information

the time it takes until a radioactive substance undergoes a decay

the time it takes until a radioactive substance undergoes a decay 1 Probabilities 1.1 Experiments with randomness Wewillusethetermexperimentinaverygeneralwaytorefertosomeprocess that produces a random outcome. Examples: (Ask class for some first) Here are some discrete

More information

3.2 Intoduction to probability 3.3 Probability rules. Sections 3.2 and 3.3. Elementary Statistics for the Biological and Life Sciences (Stat 205)

3.2 Intoduction to probability 3.3 Probability rules. Sections 3.2 and 3.3. Elementary Statistics for the Biological and Life Sciences (Stat 205) 3.2 Intoduction to probability Sections 3.2 and 3.3 Elementary Statistics for the Biological and Life Sciences (Stat 205) 1 / 47 Probability 3.2 Intoduction to probability The probability of an event E

More information

n How to represent uncertainty in knowledge? n Which action to choose under uncertainty? q Assume the car does not have a flat tire

n How to represent uncertainty in knowledge? n Which action to choose under uncertainty? q Assume the car does not have a flat tire Uncertainty Uncertainty Russell & Norvig Chapter 13 Let A t be the action of leaving for the airport t minutes before your flight Will A t get you there on time? A purely logical approach either 1. risks

More information

6.042/18.062J Mathematics for Computer Science November 28, 2006 Tom Leighton and Ronitt Rubinfeld. Random Variables

6.042/18.062J Mathematics for Computer Science November 28, 2006 Tom Leighton and Ronitt Rubinfeld. Random Variables 6.042/18.062J Mathematics for Computer Science November 28, 2006 Tom Leighton and Ronitt Rubinfeld Lecture Notes Random Variables We ve used probablity to model a variety of experiments, games, and tests.

More information

Lecture 6 - Random Variables and Parameterized Sample Spaces

Lecture 6 - Random Variables and Parameterized Sample Spaces Lecture 6 - Random Variables and Parameterized Sample Spaces 6.042 - February 25, 2003 We ve used probablity to model a variety of experiments, games, and tests. Throughout, we have tried to compute probabilities

More information

COMP61011 : Machine Learning. Probabilis*c Models + Bayes Theorem

COMP61011 : Machine Learning. Probabilis*c Models + Bayes Theorem COMP61011 : Machine Learning Probabilis*c Models + Bayes Theorem Probabilis*c Models - one of the most active areas of ML research in last 15 years - foundation of numerous new technologies - enables decision-making

More information

Probability (Devore Chapter Two)

Probability (Devore Chapter Two) Probability (Devore Chapter Two) 1016-345-01: Probability and Statistics for Engineers Fall 2012 Contents 0 Administrata 2 0.1 Outline....................................... 3 1 Axiomatic Probability 3

More information

Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is

Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is Chapter 13, Probability from Applied Finite Mathematics by Rupinder Sekhon was developed by OpenStax College, licensed by Rice University, and is available on the Connexions website. It is used under a

More information

V. Probability. by David M. Lane and Dan Osherson

V. Probability. by David M. Lane and Dan Osherson V. Probability by David M. Lane and Dan Osherson Prerequisites none F.Introduction G.Basic Concepts I.Gamblers Fallacy Simulation K.Binomial Distribution L.Binomial Demonstration M.Base Rates Probability

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics March 14, 2018 CS 361: Probability & Statistics Inference The prior From Bayes rule, we know that we can express our function of interest as Likelihood Prior Posterior The right hand side contains the

More information

Elementary Discrete Probability

Elementary Discrete Probability Elementary Discrete Probability MATH 472 Financial Mathematics J Robert Buchanan 2018 Objectives In this lesson we will learn: the terminology of elementary probability, elementary rules of probability,

More information

P (A B) P ((B C) A) P (B A) = P (B A) + P (C A) P (A) = P (B A) + P (C A) = Q(A) + Q(B).

P (A B) P ((B C) A) P (B A) = P (B A) + P (C A) P (A) = P (B A) + P (C A) = Q(A) + Q(B). Lectures 7-8 jacques@ucsdedu 41 Conditional Probability Let (Ω, F, P ) be a probability space Suppose that we have prior information which leads us to conclude that an event A F occurs Based on this information,

More information

Problems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman.

Problems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman. Math 224 Fall 2017 Homework 1 Drew Armstrong Problems from Probability and Statistical Inference (9th ed.) by Hogg, Tanis and Zimmerman. Section 1.1, Exercises 4,5,6,7,9,12. Solutions to Book Problems.

More information

CSC Discrete Math I, Spring Discrete Probability

CSC Discrete Math I, Spring Discrete Probability CSC 125 - Discrete Math I, Spring 2017 Discrete Probability Probability of an Event Pierre-Simon Laplace s classical theory of probability: Definition of terms: An experiment is a procedure that yields

More information

Lecture Stat 302 Introduction to Probability - Slides 5

Lecture Stat 302 Introduction to Probability - Slides 5 Lecture Stat 302 Introduction to Probability - Slides 5 AD Jan. 2010 AD () Jan. 2010 1 / 20 Conditional Probabilities Conditional Probability. Consider an experiment with sample space S. Let E and F be

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics September 12, 2017 CS 361: Probability & Statistics Correlation Summary of what we proved We wanted a way of predicting y from x We chose to think in standard coordinates and to use a linear predictor

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

The rest of the course

The rest of the course The rest of the course Subtleties involved with maximizing expected utility: Finding the right state space: The wrong state space leads to intuitively incorrect answers when conditioning Taking causality

More information

Probabilistic models

Probabilistic models Kolmogorov (Andrei Nikolaevich, 1903 1987) put forward an axiomatic system for probability theory. Foundations of the Calculus of Probabilities, published in 1933, immediately became the definitive formulation

More information

Econ 325: Introduction to Empirical Economics

Econ 325: Introduction to Empirical Economics Econ 325: Introduction to Empirical Economics Lecture 2 Probability Copyright 2010 Pearson Education, Inc. Publishing as Prentice Hall Ch. 3-1 3.1 Definition Random Experiment a process leading to an uncertain

More information

What is Probability? Probability. Sample Spaces and Events. Simple Event

What is Probability? Probability. Sample Spaces and Events. Simple Event What is Probability? Probability Peter Lo Probability is the numerical measure of likelihood that the event will occur. Simple Event Joint Event Compound Event Lies between 0 & 1 Sum of events is 1 1.5

More information

Today we ll discuss ways to learn how to think about events that are influenced by chance.

Today we ll discuss ways to learn how to think about events that are influenced by chance. Overview Today we ll discuss ways to learn how to think about events that are influenced by chance. Basic probability: cards, coins and dice Definitions and rules: mutually exclusive events and independent

More information

Probability (Devore Chapter Two)

Probability (Devore Chapter Two) Probability (Devore Chapter Two) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 0 Preliminaries 3 0.1 Motivation..................................... 3 0.2 Administrata...................................

More information

MAT Mathematics in Today's World

MAT Mathematics in Today's World MAT 1000 Mathematics in Today's World Last Time We discussed the four rules that govern probabilities: 1. Probabilities are numbers between 0 and 1 2. The probability an event does not occur is 1 minus

More information

CS 361: Probability & Statistics

CS 361: Probability & Statistics October 17, 2017 CS 361: Probability & Statistics Inference Maximum likelihood: drawbacks A couple of things might trip up max likelihood estimation: 1) Finding the maximum of some functions can be quite

More information

Outline. Introduction. Bayesian Probability Theory Bayes rule Bayes rule applied to learning Bayesian learning and the MDL principle

Outline. Introduction. Bayesian Probability Theory Bayes rule Bayes rule applied to learning Bayesian learning and the MDL principle Outline Introduction Bayesian Probability Theory Bayes rule Bayes rule applied to learning Bayesian learning and the MDL principle Sequence Prediction and Data Compression Bayesian Networks Copyright 2015

More information

P Q (P Q) (P Q) (P Q) (P % Q) T T T T T T T F F T F F F T F T T T F F F F T T

P Q (P Q) (P Q) (P Q) (P % Q) T T T T T T T F F T F F F T F T T T F F F F T T Logic and Reasoning Final Exam Practice Fall 2017 Name Section Number The final examination is worth 100 points. 1. (10 points) What is an argument? Explain what is meant when one says that logic is the

More information

Probability Hal Daumé III. Computer Science University of Maryland CS 421: Introduction to Artificial Intelligence 27 Mar 2012

Probability Hal Daumé III. Computer Science University of Maryland CS 421: Introduction to Artificial Intelligence 27 Mar 2012 1 Hal Daumé III (me@hal3.name) Probability 101++ Hal Daumé III Computer Science University of Maryland me@hal3.name CS 421: Introduction to Artificial Intelligence 27 Mar 2012 Many slides courtesy of Dan

More information

Where are we in CS 440?

Where are we in CS 440? Where are we in CS 440? Now leaving: sequential deterministic reasoning Entering: probabilistic reasoning and machine learning robability: Review of main concepts Chapter 3 Motivation: lanning under uncertainty

More information

Why should you care?? Intellectual curiosity. Gambling. Mathematically the same as the ESP decision problem we discussed in Week 4.

Why should you care?? Intellectual curiosity. Gambling. Mathematically the same as the ESP decision problem we discussed in Week 4. I. Probability basics (Sections 4.1 and 4.2) Flip a fair (probability of HEADS is 1/2) coin ten times. What is the probability of getting exactly 5 HEADS? What is the probability of getting exactly 10

More information

Examples of frequentist probability include games of chance, sample surveys, and randomized experiments. We will focus on frequentist probability sinc

Examples of frequentist probability include games of chance, sample surveys, and randomized experiments. We will focus on frequentist probability sinc FPPA-Chapters 13,14 and parts of 16,17, and 18 STATISTICS 50 Richard A. Berk Spring, 1997 May 30, 1997 1 Thinking about Chance People talk about \chance" and \probability" all the time. There are many

More information

Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events

Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events Lecture 8: Conditional probability I: definition, independence, the tree method, sampling, chain rule for independent events Discrete Structures II (Summer 2018) Rutgers University Instructor: Abhishek

More information

Hypothesis testing I. - In particular, we are talking about statistical hypotheses. [get everyone s finger length!] n =

Hypothesis testing I. - In particular, we are talking about statistical hypotheses. [get everyone s finger length!] n = Hypothesis testing I I. What is hypothesis testing? [Note we re temporarily bouncing around in the book a lot! Things will settle down again in a week or so] - Exactly what it says. We develop a hypothesis,

More information

MATH 10 INTRODUCTORY STATISTICS

MATH 10 INTRODUCTORY STATISTICS MATH 10 INTRODUCTORY STATISTICS Ramesh Yapalparvi Week 2 Chapter 4 Bivariate Data Data with two/paired variables, Pearson correlation coefficient and its properties, general variance sum law Chapter 6

More information

Our Status. We re done with Part I Search and Planning!

Our Status. We re done with Part I Search and Planning! Probability [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.] Our Status We re done with Part

More information

UNIT 5 ~ Probability: What Are the Chances? 1

UNIT 5 ~ Probability: What Are the Chances? 1 UNIT 5 ~ Probability: What Are the Chances? 1 6.1: Simulation Simulation: The of chance behavior, based on a that accurately reflects the phenomenon under consideration. (ex 1) Suppose we are interested

More information

Probability in Programming. Prakash Panangaden School of Computer Science McGill University

Probability in Programming. Prakash Panangaden School of Computer Science McGill University Probability in Programming Prakash Panangaden School of Computer Science McGill University Two crucial issues Two crucial issues Correctness of software: Two crucial issues Correctness of software: with

More information

With Question/Answer Animations. Chapter 7

With Question/Answer Animations. Chapter 7 With Question/Answer Animations Chapter 7 Chapter Summary Introduction to Discrete Probability Probability Theory Bayes Theorem Section 7.1 Section Summary Finite Probability Probabilities of Complements

More information

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev

CS4705. Probability Review and Naïve Bayes. Slides from Dragomir Radev CS4705 Probability Review and Naïve Bayes Slides from Dragomir Radev Classification using a Generative Approach Previously on NLP discriminative models P C D here is a line with all the social media posts

More information

CS1800: Strong Induction. Professor Kevin Gold

CS1800: Strong Induction. Professor Kevin Gold CS1800: Strong Induction Professor Kevin Gold Mini-Primer/Refresher on Unrelated Topic: Limits This is meant to be a problem about reasoning about quantifiers, with a little practice of other skills, too

More information

CS 246 Review of Proof Techniques and Probability 01/14/19

CS 246 Review of Proof Techniques and Probability 01/14/19 Note: This document has been adapted from a similar review session for CS224W (Autumn 2018). It was originally compiled by Jessica Su, with minor edits by Jayadev Bhaskaran. 1 Proof techniques Here we

More information

Discrete Probability

Discrete Probability Discrete Probability Mark Muldoon School of Mathematics, University of Manchester M05: Mathematical Methods, January 30, 2007 Discrete Probability - p. 1/38 Overview Mutually exclusive Independent More

More information

Maximizing expected utility

Maximizing expected utility Maximizing expected utility Earlier we looked at many decision rules: maximin minimax regret principle of insufficient reason... The most commonly used rule (and the one taught in business schools!) is

More information

STA111 - Lecture 1 Welcome to STA111! 1 What is the difference between Probability and Statistics?

STA111 - Lecture 1 Welcome to STA111! 1 What is the difference between Probability and Statistics? STA111 - Lecture 1 Welcome to STA111! Some basic information: Instructor: Víctor Peña (email: vp58@duke.edu) Course Website: http://stat.duke.edu/~vp58/sta111. 1 What is the difference between Probability

More information

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102

Mean, Median and Mode. Lecture 3 - Axioms of Probability. Where do they come from? Graphically. We start with a set of 21 numbers, Sta102 / BME102 Mean, Median and Mode Lecture 3 - Axioms of Probability Sta102 / BME102 Colin Rundel September 1, 2014 We start with a set of 21 numbers, ## [1] -2.2-1.6-1.0-0.5-0.4-0.3-0.2 0.1 0.1 0.2 0.4 ## [12] 0.4

More information

Contents. Decision Making under Uncertainty 1. Meanings of uncertainty. Classical interpretation

Contents. Decision Making under Uncertainty 1. Meanings of uncertainty. Classical interpretation Contents Decision Making under Uncertainty 1 elearning resources Prof. Ahti Salo Helsinki University of Technology http://www.dm.hut.fi Meanings of uncertainty Interpretations of probability Biases in

More information

Deep Learning for Computer Vision

Deep Learning for Computer Vision Deep Learning for Computer Vision Lecture 3: Probability, Bayes Theorem, and Bayes Classification Peter Belhumeur Computer Science Columbia University Probability Should you play this game? Game: A fair

More information

P (B)P (A B) = P (AB) = P (A)P (B A), and dividing both sides by P (B) gives Bayes rule: P (A B) = P (A) P (B A) P (B),

P (B)P (A B) = P (AB) = P (A)P (B A), and dividing both sides by P (B) gives Bayes rule: P (A B) = P (A) P (B A) P (B), Conditional probability 18.600 Problem Set 3, due March 2 Welcome to your third 18.600 problem set! Conditional probability is defined by P (A B) = P (AB)/P (B), which implies P (B)P (A B) = P (AB) = P

More information

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018

Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 Mathematical Foundations of Computer Science Lecture Outline October 18, 2018 The Total Probability Theorem. Consider events E and F. Consider a sample point ω E. Observe that ω belongs to either F or

More information

Lecture 3 Probability Basics

Lecture 3 Probability Basics Lecture 3 Probability Basics Thais Paiva STA 111 - Summer 2013 Term II July 3, 2013 Lecture Plan 1 Definitions of probability 2 Rules of probability 3 Conditional probability What is Probability? Probability

More information

Term Definition Example Random Phenomena

Term Definition Example Random Phenomena UNIT VI STUDY GUIDE Probabilities Course Learning Outcomes for Unit VI Upon completion of this unit, students should be able to: 1. Apply mathematical principles used in real-world situations. 1.1 Demonstrate

More information

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability Lecture Notes 1 Basic Probability Set Theory Elements of Probability Conditional probability Sequential Calculation of Probability Total Probability and Bayes Rule Independence Counting EE 178/278A: Basic

More information

CS 188: Artificial Intelligence Fall 2009

CS 188: Artificial Intelligence Fall 2009 CS 188: Artificial Intelligence Fall 2009 Lecture 13: Probability 10/8/2009 Dan Klein UC Berkeley 1 Announcements Upcoming P3 Due 10/12 W2 Due 10/15 Midterm in evening of 10/22 Review sessions: Probability

More information

Steve Smith Tuition: Maths Notes

Steve Smith Tuition: Maths Notes Maths Notes : Discrete Random Variables Version. Steve Smith Tuition: Maths Notes e iπ + = 0 a + b = c z n+ = z n + c V E + F = Discrete Random Variables Contents Intro The Distribution of Probabilities

More information

1 True/False. Math 10B with Professor Stankova Worksheet, Discussion #9; Thursday, 2/15/2018 GSI name: Roy Zhao

1 True/False. Math 10B with Professor Stankova Worksheet, Discussion #9; Thursday, 2/15/2018 GSI name: Roy Zhao Math 10B with Professor Stankova Worksheet, Discussion #9; Thursday, 2/15/2018 GSI name: Roy Zhao 1 True/False 1. True False When we solve a problem one way, it is not useful to try to solve it in a second

More information

ORF 245 Fundamentals of Statistics Chapter 1 Probability

ORF 245 Fundamentals of Statistics Chapter 1 Probability ORF 245 Fundamentals of Statistics Chapter 1 Probability Robert Vanderbei Sept 2014 Slides last edited on September 19, 2014 http://www.princeton.edu/ rvdb Course Info Prereqs: Textbook: Three semesters

More information