1.7: Bayes Theorem. Jiakun Pan. Feb 4, 2019


Bayesian Filtering

For experiments with two parts, Bayesian filtering is the idea of predicting the second part based on the first part:

- Email programs flag incoming messages with too many typos as spam and quarantine them automatically.
- When your FICO score goes up, your credit limit gets raised.
- If you drive safely for a whole year, your auto insurance policy will be cheaper.
- To diagnose cold symptoms, physicians may ask patients whether they have traveled outside the U.S. within the last 30 days.
- Recommendation letters matter heavily in modern job applications.
- Etc.

Bayesian Reasoning

However, sometimes we only observe the outcome of the second part of an experiment, and want to attribute it to some cause in the unobserved first part. This is where Bayesian reasoning comes to help. Recall the GRE sample problem...

GRE Sample

There are two boxes, each containing four balls. Box A has one red ball and three black balls, while box B has two red balls and two black balls. You randomly pick a ball from them, and it turns out to be red.
Question: What is the probability that the ball is from box A?
Answer: 1/3.

Solving the Sample Problem

Let S be the sample space (a ball is picked from the two boxes), E be the event that the ball is picked from box A, and F be the event that the ball is red. Then P(E|F) stands for the probability that the ball was picked from box A, given that it is red. By definition, we know

    P(E|F) = P(E ∩ F) / P(F),

and the probability that the picked ball is the red ball of box A is

    P(E ∩ F) = n(E ∩ F) / n(S) = 1/8,

so it boils down to determining P(F).

Solving the Sample Problem (Continued)

By definition,

    P(F) = n(F) / n(S) = 3/8.

So the answer is

    P(E|F) = (1/8) / (3/8) = 1/3.

Bayesian reasoning will be crucial when P(F) cannot be found easily, for example when n(S) and n(F) are unknown, or when a probability is given only after the first step, so that you have only partial information about P(F).
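The counting argument above can be checked with a short Python sketch that enumerates the eight equally likely balls (the variable names are my own, not from the slides):

```python
from fractions import Fraction

# Enumerate all 8 equally likely outcomes as (box, color) pairs.
# Box A: 1 red, 3 black; box B: 2 red, 2 black.
balls = [("A", "red")] + [("A", "black")] * 3 \
      + [("B", "red")] * 2 + [("B", "black")] * 2

red = [b for b in balls if b[1] == "red"]        # event F
red_from_A = [b for b in red if b[0] == "A"]     # event E ∩ F

p_F = Fraction(len(red), len(balls))             # P(F) = 3/8
p_EF = Fraction(len(red_from_A), len(balls))     # P(E ∩ F) = 1/8
p_E_given_F = p_EF / p_F                         # P(E|F)
print(p_E_given_F)  # 1/3
```

Counting outcomes directly like this is exactly the "straightforward method by definition"; it works because every ball is equally likely.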

Sketch of a Novel

A millionaire was murdered. Through a detective's investigation, there are three suspects: A, the maid of the victim; B, the manager of the millionaire's family foundation; and C, a professional assassin who kills for money. The detective also found that the crime was committed in one blow by one person. In other words, exactly one of A, B, and C attempted the murder, while the other two were sleeping alone at home.

Sketch of a Novel (Continued)

Combining all available information about likelihood (incentive, alibi, tweets, etc.), the detective thinks the probabilities that A, B, and C made the attempt are 60%, 35%, and 5%, respectively. Estimating success rates, the detective guesses A had a 5% probability of completing the murder, had A made the attempt. Similarly, B has a 15% success rate, and C has 95%.
Question: How suspicious is each of these suspects?

Sketch of a Novel (Continued)

Let E1, E2, E3 be the events that A, B, C tried to murder, respectively, and let F be the event that the rich man died of murder. How do we translate "the probability that A tried to murder, given that the attempt turned out to be successful"? As P(E1|F). By definition,

    P(E1|F) = P(E1 ∩ F) / P(F),    (1)

and by the product rule we have that

    P(E1 ∩ F) = P(E1) P(F|E1) = 60% × 5% = 0.03.

Similarly, P(E2 ∩ F) = 35% × 15% = 0.0525, and P(E3 ∩ F) = 5% × 95% = 0.0475. It remains to find P(F).

Sketch of a Novel (Continued)

Recall that we assumed exactly one of the three tried to kill, and nobody else, so the events E1 ∩ F, E2 ∩ F, and E3 ∩ F make a partition of the event F. Here by "partition" we mean a separation with no omission and no overlap. The formula for mutually exclusive events tells us

    P(F) = P(E1 ∩ F) + P(E2 ∩ F) + P(E3 ∩ F) = 0.03 + 0.0525 + 0.0475 = 0.13,

and after applying formula (1), the answer comes out.
Answer: The probabilities that A, B, and C committed the murder are 23.08%, 40.38%, and 36.54%, respectively.
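The whole computation fits in a few lines of Python (the dictionary names are my own, not from the novel or the slides):

```python
# Priors: the detective's probability that each suspect made the attempt.
priors = {"A": 0.60, "B": 0.35, "C": 0.05}
# Likelihoods: probability the attempt succeeds, given who made it.
likelihoods = {"A": 0.05, "B": 0.15, "C": 0.95}

# Joint probabilities P(E_i ∩ F) by the product rule.
joints = {s: priors[s] * likelihoods[s] for s in priors}
# Total probability P(F): the joints partition F, so they sum to P(F).
p_F = sum(joints.values())
# Posteriors P(E_i | F) by formula (1).
posteriors = {s: joints[s] / p_F for s in priors}

for s, p in posteriors.items():
    print(f"{s}: {p:.2%}")
```

Note the reversal: A is the most likely to have tried (60%), but B is the most likely to have succeeded (40.38%), because the posterior weighs the prior by the success rate.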

The Bayes Theorem

Now we can formulate the theorem. With a sample space S and a positive integer m, assume E1, E2, ..., Em make a partition of S, and let F be another event, where P(E1), ..., P(Em) > 0 and P(F) > 0. Then for any i = 1, 2, ..., m, we have

    P(Ei|F) = P(Ei) P(F|Ei) / [ P(E1) P(F|E1) + P(E2) P(F|E2) + ... + P(Em) P(F|Em) ],

named after the Presbyterian minister Thomas Bayes (c. 1701–1761).
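As a sketch, the general formula translates directly into a small helper function (a generic illustration under the theorem's assumptions, not code from the course):

```python
def bayes(priors, likelihoods, i):
    """Return P(E_i | F) for a partition E_1, ..., E_m of S,
    given priors P(E_j) and likelihoods P(F | E_j)."""
    # Denominator: total probability P(F) summed over the partition.
    p_F = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / p_F

# The novel's numbers: index 1 is suspect B, the most suspicious.
print(bayes([0.60, 0.35, 0.05], [0.05, 0.15, 0.95], 1))  # ≈ 0.4038
```

The hypotheses of the theorem matter here: the function silently assumes the priors cover a genuine partition, i.e. they are positive and the events are exhaustive and mutually exclusive.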

Summary

Bayes theorem is not new to us, in the sense that we can derive it from the definition of conditional probability in the previous section. With this new tool, when asked to find some conditional probability, you will need to decide which method to use. Do NOT apply this theorem indiscriminately; sometimes the straightforward method by definition works better. Probability trees are useful for these problems. Building these problem-solving skills takes practice.


More information

UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis

UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis UCSD CSE 21, Spring 2014 [Section B00] Mathematics for Algorithm and System Analysis Final Exam Review Session Class URL: http://vlsicad.ucsd.edu/courses/cse21-s14/ Notes 140608 Review Things to Know has

More information

CS 188: Artificial Intelligence Spring Today

CS 188: Artificial Intelligence Spring Today CS 188: Artificial Intelligence Spring 2006 Lecture 9: Naïve Bayes 2/14/2006 Dan Klein UC Berkeley Many slides from either Stuart Russell or Andrew Moore Bayes rule Today Expectations and utilities Naïve

More information

Joint, Conditional, & Marginal Probabilities

Joint, Conditional, & Marginal Probabilities Joint, Conditional, & Marginal Probabilities Statistics 110 Summer 2006 Copyright c 2006 by Mark E. Irwin Joint, Conditional, & Marginal Probabilities The three axioms for probability don t discuss how

More information

Introduction to Probability Theory

Introduction to Probability Theory Introduction to Probability Theory Overview The concept of probability is commonly used in everyday life, and can be expressed in many ways. For example, there is a 50:50 chance of a head when a fair coin

More information

Bayesian Updating: Discrete Priors: Spring

Bayesian Updating: Discrete Priors: Spring Bayesian Updating: Discrete Priors: 18.05 Spring 2017 http://xkcd.com/1236/ Learning from experience Which treatment would you choose? 1. Treatment 1: cured 100% of patients in a trial. 2. Treatment 2:

More information

Time: 1 hour 30 minutes

Time: 1 hour 30 minutes Paper Reference(s) 668/0 Edexcel GCE Statistics S Silver Level S2 Time: hour 0 minutes Materials required for examination papers Mathematical Formulae (Green) Items included with question Nil Candidates

More information

Lecture 3 : Probability II. Jonathan Marchini

Lecture 3 : Probability II. Jonathan Marchini Lecture 3 : Probability II Jonathan Marchini Puzzle 1 Pick any two types of card that can occur in a normal pack of shuffled playing cards e.g. Queen and 6. What do you think is the probability that somewhere

More information

STAT 516 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS. = {(a 1, a 2,...) : a i < 6 for all i}

STAT 516 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS. = {(a 1, a 2,...) : a i < 6 for all i} STAT 56 Answers Homework 2 January 23, 2008 Solutions by Mark Daniel Ward PROBLEMS 2. We note that E n consists of rolls that end in 6, namely, experiments of the form (a, a 2,...,a n, 6 for n and a i

More information

Be able to define the following terms and answer basic questions about them:

Be able to define the following terms and answer basic questions about them: CS440/ECE448 Section Q Fall 2017 Final Review Be able to define the following terms and answer basic questions about them: Probability o Random variables, axioms of probability o Joint, marginal, conditional

More information

Binomial Probability. Permutations and Combinations. Review. History Note. Discuss Quizzes/Answer Questions. 9.0 Lesson Plan

Binomial Probability. Permutations and Combinations. Review. History Note. Discuss Quizzes/Answer Questions. 9.0 Lesson Plan 9.0 Lesson Plan Discuss Quizzes/Answer Questions History Note Review Permutations and Combinations Binomial Probability 1 9.1 History Note Pascal and Fermat laid out the basic rules of probability in a

More information

4. Conditional Probability P( ) CSE 312 Autumn 2012 W.L. Ruzzo

4. Conditional Probability P( ) CSE 312 Autumn 2012 W.L. Ruzzo 4. Conditional Probability P( ) CSE 312 Autumn 2012 W.L. Ruzzo 1 conditional probability Conditional probability of E given F: probability that E occurs given that F has occurred. Conditioning on F S Written

More information

Chapter 3 : Conditional Probability and Independence

Chapter 3 : Conditional Probability and Independence STAT/MATH 394 A - PROBABILITY I UW Autumn Quarter 2016 Néhémy Lim Chapter 3 : Conditional Probability and Independence 1 Conditional Probabilities How should we modify the probability of an event when

More information

More on conditioning and Mr. Bayes

More on conditioning and Mr. Bayes More on conditioning and Mr. Bayes Saad Mneimneh 1 Multiplication rule for conditioning We can generalize the formula P(A,B) P(A B)P(B) to more than two events. For instance, P(A,B,C) P(A)P(B A)P(C A,B).

More information

CS 188: Artificial Intelligence Fall 2008

CS 188: Artificial Intelligence Fall 2008 CS 188: Artificial Intelligence Fall 2008 Lecture 23: Perceptrons 11/20/2008 Dan Klein UC Berkeley 1 General Naïve Bayes A general naive Bayes model: C E 1 E 2 E n We only specify how each feature depends

More information

General Naïve Bayes. CS 188: Artificial Intelligence Fall Example: Overfitting. Example: OCR. Example: Spam Filtering. Example: Spam Filtering

General Naïve Bayes. CS 188: Artificial Intelligence Fall Example: Overfitting. Example: OCR. Example: Spam Filtering. Example: Spam Filtering CS 188: Artificial Intelligence Fall 2008 General Naïve Bayes A general naive Bayes model: C Lecture 23: Perceptrons 11/20/2008 E 1 E 2 E n Dan Klein UC Berkeley We only specify how each feature depends

More information

Topic 5: Probability. 5.4 Combined Events and Conditional Probability Paper 1

Topic 5: Probability. 5.4 Combined Events and Conditional Probability Paper 1 Topic 5: Probability Standard Level 5.4 Combined Events and Conditional Probability Paper 1 1. In a group of 16 students, 12 take art and 8 take music. One student takes neither art nor music. The Venn

More information

( ) P A B : Probability of A given B. Probability that A happens

( ) P A B : Probability of A given B. Probability that A happens A B A or B One or the other or both occurs At least one of A or B occurs Probability Review A B A and B Both A and B occur ( ) P A B : Probability of A given B. Probability that A happens given that B

More information

Some Concepts of Probability (Review) Volker Tresp Summer 2018

Some Concepts of Probability (Review) Volker Tresp Summer 2018 Some Concepts of Probability (Review) Volker Tresp Summer 2018 1 Definition There are different way to define what a probability stands for Mathematically, the most rigorous definition is based on Kolmogorov

More information

Lecture 2: Bayesian Classification

Lecture 2: Bayesian Classification Lecture 2: Bayesian Classification 1 Content Reminders from previous lecture Historical note Conditional probability Bayes theorem Bayesian classifiers Example 1: Marketing promotions Example 2: Items

More information

Use Newton s law of cooling to narrow down the number of suspects by determining when the victim was killed.

Use Newton s law of cooling to narrow down the number of suspects by determining when the victim was killed. Case File 14 Hot Air, Cold Body: Using Newton s law of cooling to determine time of death Use Newton s law of cooling to narrow down the number of suspects by determining when the victim was killed. Memo

More information

Introduction to Probability 2017/18 Supplementary Problems

Introduction to Probability 2017/18 Supplementary Problems Introduction to Probability 2017/18 Supplementary Problems Problem 1: Let A and B denote two events with P(A B) 0. Show that P(A) 0 and P(B) 0. A A B implies P(A) P(A B) 0, hence P(A) 0. Similarly B A

More information

Overview. Overview. Overview. Specific Examples. General Examples. Bivariate Regression & Correlation

Overview. Overview. Overview. Specific Examples. General Examples. Bivariate Regression & Correlation Bivariate Regression & Correlation Overview The Scatter Diagram Two Examples: Education & Prestige Correlation Coefficient Bivariate Linear Regression Line SPSS Output Interpretation Covariance ou already

More information

A Study On Problem Solving Using Bayes Theorem

A Study On Problem Solving Using Bayes Theorem vailable at https://edupediapublications.org/journals Volume 03 Issue 4 October06 Study On roblem Solving Using ayes Theorem Ismael Yaseen bdulridha lasadi M.Sc, pplied Mathematics University College of

More information

Be able to define the following terms and answer basic questions about them:

Be able to define the following terms and answer basic questions about them: CS440/ECE448 Fall 2016 Final Review Be able to define the following terms and answer basic questions about them: Probability o Random variables o Axioms of probability o Joint, marginal, conditional probability

More information

Probability Theory and Applications

Probability Theory and Applications Probability Theory and Applications Videos of the topics covered in this manual are available at the following links: Lesson 4 Probability I http://faculty.citadel.edu/silver/ba205/online course/lesson

More information

Introduction to probability

Introduction to probability Introduction to probability 4.1 The Basics of Probability Probability The chance that a particular event will occur The probability value will be in the range 0 to 1 Experiment A process that produces

More information

Probability and (Bayesian) Data Analysis

Probability and (Bayesian) Data Analysis Department of Statistics The University of Auckland https://www.stat.auckland.ac.nz/ brewer/ Where to get everything To get all of the material (slides, code, exercises): git clone --recursive https://github.com/eggplantbren/madrid

More information

TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM

TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM Topic Concepts Degree of Importance References NCERT Book Vol. II Probability (i) Conditional Probability *** Article 1.2 and 1.2.1 Solved Examples 1 to 6 Q. Nos

More information

STAT Chapter 3: Probability

STAT Chapter 3: Probability Basic Definitions STAT 515 --- Chapter 3: Probability Experiment: A process which leads to a single outcome (called a sample point) that cannot be predicted with certainty. Sample Space (of an experiment):

More information

Lecture 9: Naive Bayes, SVM, Kernels. Saravanan Thirumuruganathan

Lecture 9: Naive Bayes, SVM, Kernels. Saravanan Thirumuruganathan Lecture 9: Naive Bayes, SVM, Kernels Instructor: Outline 1 Probability basics 2 Probabilistic Interpretation of Classification 3 Bayesian Classifiers, Naive Bayes 4 Support Vector Machines Probability

More information

CS626 Data Analysis and Simulation

CS626 Data Analysis and Simulation CS626 Data Analysis and Simulation Instructor: Peter Kemper R 104A, phone 221-3462, email:kemper@cs.wm.edu Today: Probability Primer Quick Reference: Sheldon Ross: Introduction to Probability Models 9th

More information

Reading for Lecture 6 Release v10

Reading for Lecture 6 Release v10 Reading for Lecture 6 Release v10 Christopher Lee October 11, 2011 Contents 1 The Basics ii 1.1 What is a Hypothesis Test?........................................ ii Example..................................................

More information

Honors Algebra 2 Final Exam 2002

Honors Algebra 2 Final Exam 2002 Honors Algebra 2 Final Exam 2002 Name PART A. MULTIPLE CHOICE. Circle the letter in front of each correct answer. You do not have to show work. There is no partial credit. EACH PROBLEM IN THIS SECTION

More information

3 Conditional Probability

3 Conditional Probability 3 Conditional Probability Question: What are the chances that a college student chosen at random from the U.S. population is a fan of the Notre Dame football team? Now, if the person chosen is a student

More information

Section 13.3 Probability

Section 13.3 Probability 288 Section 13.3 Probability Probability is a measure of how likely an event will occur. When the weather forecaster says that there will be a 50% chance of rain this afternoon, the probability that it

More information

Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides

Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides slide 1 Inference with Bayes rule: Example In a bag there are two envelopes one has a red ball (worth $100) and a black ball one

More information

Entropy. Expected Surprise

Entropy. Expected Surprise Entropy Let X be a discrete random variable The surprise of observing X = x is defined as log 2 P(X=x) Surprise of probability 1 is zero. Surprise of probability 0 is (c) 200 Thomas G. Dietterich 1 Expected

More information

Statistical Theory 1

Statistical Theory 1 Statistical Theory 1 Set Theory and Probability Paolo Bautista September 12, 2017 Set Theory We start by defining terms in Set Theory which will be used in the following sections. Definition 1 A set is

More information

Week 2: Probability: Counting, Sets, and Bayes

Week 2: Probability: Counting, Sets, and Bayes Statistical Methods APPM 4570/5570, STAT 4000/5000 21 Probability Introduction to EDA Week 2: Probability: Counting, Sets, and Bayes Random variable Random variable is a measurable quantity whose outcome

More information