1.7: Bayes Theorem. Jiakun Pan. Feb 4, 2019
Bayesian Filtering

For experiments of two parts, Bayesian filtering is the idea of predicting the second part based on the first part:
- Email programs regard incoming messages with too many typos as spam, and quarantine them automatically.
- When your FICO score goes up, your credit limit gets raised.
- If you drive safely for a whole year, your auto insurance policy will be cheaper.
- To diagnose cold symptoms, physicians may ask patients if they have traveled outside the U.S. within 30 days.
- Recommendation letters matter heavily in job applications in modern society. Etc.
Bayesian Reasoning

However, sometimes we only know the outcome of the second part of an experiment, and want to attribute it to some cause in the unobserved first part. Then Bayesian reasoning comes to help. Recall the GRE sample problem...
GRE Sample

There are two boxes, each of which contains four balls. Box A has one red ball and three black balls, while box B has two red balls and two black balls. You randomly pick a ball from them, and it turns out to be red. Question: What is the probability that the ball is from box A? Answer: 1/3.
Solving the Sample Problem

Let S be the sample space of picking a ball from the two boxes, E be the event that the ball is picked from box A, and F be the event that the ball is red. Then P(E|F) stands for the probability that the ball is picked from box A, provided that it is red. By definition, we know

    P(E|F) = P(E ∩ F) / P(F),

and the probability that the picked ball is the red ball of box A is

    P(E ∩ F) = n(E ∩ F) / n(S) = 1/8,

so it boils down to determining P(F).
Solving the Sample Problem (Continued)

By definition,

    P(F) = n(F) / n(S) = 3/8.

So the answer is

    P(E|F) = (1/8) / (3/8) = 1/3.

Bayesian reasoning will be crucial when P(F) can't be found easily, for example, when n(S) and n(F) are unknown, or when probabilities are provided only after the first step, so that you have only partial information on P(F).
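The counting argument above can be checked with a short script. The explicit list of the eight balls below is a hypothetical encoding of the two boxes chosen for this sketch, not part of the original problem statement.

```python
from fractions import Fraction

# Encode the sample space: one (box, color) pair per ball.
balls = ([("A", "red")] + [("A", "black")] * 3 +
         [("B", "red")] * 2 + [("B", "black")] * 2)

n_S = len(balls)                                    # n(S) = 8
n_F = sum(1 for b in balls if b[1] == "red")        # n(F) = 3
n_EF = sum(1 for b in balls if b == ("A", "red"))   # n(E ∩ F) = 1

P_EF = Fraction(n_EF, n_S)   # P(E ∩ F) = 1/8
P_F = Fraction(n_F, n_S)     # P(F) = 3/8
print(P_EF / P_F)            # P(E|F) = 1/3
```

Exact fractions avoid any floating-point rounding in the final ratio.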
Sketch of a Novel

A millionaire was murdered. Through the investigation of a detective, there are three suspects: A, the maid of the victim; B, the manager of the millionaire's family foundation; and C, a professional assassin who kills for money. The detective also found that the crime was committed in one blow by one person. In other words, only one of A, B, and C tried to murder, and the other two were sleeping alone at home.
Sketch of a Novel (Continued)

Combining all information about likelihood (incentive, alibi, tweets, etc.), the detective thinks the probability for A, B, and C to have made the attempt is 60%, 35%, and 5%, respectively. Estimating success rates, the detective guesses A had a 5% probability of completing the murder, if A made the attempt. Similarly, B has a 15% success rate, and C has 95%. Question: How suspicious is each of these suspects?
Sketch of a Novel (Continued)

Let E1, E2, E3 be the events that A, B, C tried to murder, respectively, and F be the event that the rich man died of murder. How do we translate "the probability that A tried to murder, given that the attempt turned out to be successful"? P(E1|F). By definition,

    P(E1|F) = P(E1 ∩ F) / P(F),    (1)

and by the product rule we have

    P(E1 ∩ F) = P(E1) P(F|E1) = 60% × 5% = 0.03.

Similarly, P(E2 ∩ F) = 35% × 15% = 0.0525, and P(E3 ∩ F) = 5% × 95% = 0.0475. It remains to find P(F).
Sketch of a Novel (Continued)

Recall that we assumed exactly one of the three tried to kill, and nobody else, so the events E1 ∩ F, E2 ∩ F, and E3 ∩ F make a partition of the event F. Here by partition we mean a separation with no omission or overlap. The formula for mutually exclusive events tells us

    P(F) = P(E1 ∩ F) + P(E2 ∩ F) + P(E3 ∩ F) = 0.03 + 0.0525 + 0.0475 = 0.13,

and after applying formula (1), the answer comes out. Answer: The probabilities of A, B, and C having committed the murder are 23.08%, 40.38%, and 36.54%, respectively.
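As a sanity check on the arithmetic above, here is a minimal Python sketch that recomputes the posteriors; the dictionary names are illustrative choices, not from the original slides.

```python
# Priors P(E_i): the detective's probability that each suspect made the attempt.
priors = {"A": 0.60, "B": 0.35, "C": 0.05}
# Likelihoods P(F | E_i): the chance the attempt succeeded, given who tried.
success = {"A": 0.05, "B": 0.15, "C": 0.95}

joint = {s: priors[s] * success[s] for s in priors}  # P(E_i ∩ F) by the product rule
P_F = sum(joint.values())                            # total probability: P(F) = 0.13
posterior = {s: joint[s] / P_F for s in joint}       # formula (1): P(E_i | F)

for s in posterior:
    print(f"{s}: {posterior[s]:.2%}")  # A: 23.08%, B: 40.38%, C: 36.54%
```

Note that the posteriors sum to 1, since exactly one suspect committed the murder given F.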
The Bayes Theorem

Now we can formulate the theorem. With sample space S and a positive integer m, assume E1, E2, ..., Em make a partition of S, and let F be another event, where P(E1), ..., P(Em) > 0 and P(F) > 0. Then for any i = 1, 2, ..., m, we have

    P(Ei|F) = P(Ei) P(F|Ei) / [ P(E1) P(F|E1) + P(E2) P(F|E2) + ... + P(Em) P(F|Em) ],

named after the Presbyterian minister Thomas Bayes.
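The statement translates directly into a small helper function. The name `bayes_posterior` and the convention of passing the partition as parallel lists of priors and likelihoods are assumptions made for this sketch.

```python
def bayes_posterior(priors, likelihoods, i):
    """P(E_i | F), given priors P(E_j) and likelihoods P(F | E_j)
    for a partition E_1, ..., E_m of the sample space."""
    # Denominator: the total probability P(F), summed over the partition.
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / total

# Re-deriving the novel's answer for suspect B (index 1):
print(bayes_posterior([0.60, 0.35, 0.05], [0.05, 0.15, 0.95], 1))  # about 0.4038
```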
Summary

Bayes' theorem is not new to us, in the sense that we can derive it from the definition of conditional probability in the previous section. With this new tool, you will need to decide which method to use when asked to find some conditional probability. Do NOT apply this theorem indiscriminately; sometimes the straightforward method by definition works better. Probability trees will be useful for these problems. It takes more practice to build problem-solving skills.
More informationGeneral Naïve Bayes. CS 188: Artificial Intelligence Fall Example: Overfitting. Example: OCR. Example: Spam Filtering. Example: Spam Filtering
CS 188: Artificial Intelligence Fall 2008 General Naïve Bayes A general naive Bayes model: C Lecture 23: Perceptrons 11/20/2008 E 1 E 2 E n Dan Klein UC Berkeley We only specify how each feature depends
More informationTopic 5: Probability. 5.4 Combined Events and Conditional Probability Paper 1
Topic 5: Probability Standard Level 5.4 Combined Events and Conditional Probability Paper 1 1. In a group of 16 students, 12 take art and 8 take music. One student takes neither art nor music. The Venn
More information( ) P A B : Probability of A given B. Probability that A happens
A B A or B One or the other or both occurs At least one of A or B occurs Probability Review A B A and B Both A and B occur ( ) P A B : Probability of A given B. Probability that A happens given that B
More informationSome Concepts of Probability (Review) Volker Tresp Summer 2018
Some Concepts of Probability (Review) Volker Tresp Summer 2018 1 Definition There are different way to define what a probability stands for Mathematically, the most rigorous definition is based on Kolmogorov
More informationLecture 2: Bayesian Classification
Lecture 2: Bayesian Classification 1 Content Reminders from previous lecture Historical note Conditional probability Bayes theorem Bayesian classifiers Example 1: Marketing promotions Example 2: Items
More informationUse Newton s law of cooling to narrow down the number of suspects by determining when the victim was killed.
Case File 14 Hot Air, Cold Body: Using Newton s law of cooling to determine time of death Use Newton s law of cooling to narrow down the number of suspects by determining when the victim was killed. Memo
More informationIntroduction to Probability 2017/18 Supplementary Problems
Introduction to Probability 2017/18 Supplementary Problems Problem 1: Let A and B denote two events with P(A B) 0. Show that P(A) 0 and P(B) 0. A A B implies P(A) P(A B) 0, hence P(A) 0. Similarly B A
More informationOverview. Overview. Overview. Specific Examples. General Examples. Bivariate Regression & Correlation
Bivariate Regression & Correlation Overview The Scatter Diagram Two Examples: Education & Prestige Correlation Coefficient Bivariate Linear Regression Line SPSS Output Interpretation Covariance ou already
More informationA Study On Problem Solving Using Bayes Theorem
vailable at https://edupediapublications.org/journals Volume 03 Issue 4 October06 Study On roblem Solving Using ayes Theorem Ismael Yaseen bdulridha lasadi M.Sc, pplied Mathematics University College of
More informationBe able to define the following terms and answer basic questions about them:
CS440/ECE448 Fall 2016 Final Review Be able to define the following terms and answer basic questions about them: Probability o Random variables o Axioms of probability o Joint, marginal, conditional probability
More informationProbability Theory and Applications
Probability Theory and Applications Videos of the topics covered in this manual are available at the following links: Lesson 4 Probability I http://faculty.citadel.edu/silver/ba205/online course/lesson
More informationIntroduction to probability
Introduction to probability 4.1 The Basics of Probability Probability The chance that a particular event will occur The probability value will be in the range 0 to 1 Experiment A process that produces
More informationProbability and (Bayesian) Data Analysis
Department of Statistics The University of Auckland https://www.stat.auckland.ac.nz/ brewer/ Where to get everything To get all of the material (slides, code, exercises): git clone --recursive https://github.com/eggplantbren/madrid
More informationTOPIC 12 PROBABILITY SCHEMATIC DIAGRAM
TOPIC 12 PROBABILITY SCHEMATIC DIAGRAM Topic Concepts Degree of Importance References NCERT Book Vol. II Probability (i) Conditional Probability *** Article 1.2 and 1.2.1 Solved Examples 1 to 6 Q. Nos
More informationSTAT Chapter 3: Probability
Basic Definitions STAT 515 --- Chapter 3: Probability Experiment: A process which leads to a single outcome (called a sample point) that cannot be predicted with certainty. Sample Space (of an experiment):
More informationLecture 9: Naive Bayes, SVM, Kernels. Saravanan Thirumuruganathan
Lecture 9: Naive Bayes, SVM, Kernels Instructor: Outline 1 Probability basics 2 Probabilistic Interpretation of Classification 3 Bayesian Classifiers, Naive Bayes 4 Support Vector Machines Probability
More informationCS626 Data Analysis and Simulation
CS626 Data Analysis and Simulation Instructor: Peter Kemper R 104A, phone 221-3462, email:kemper@cs.wm.edu Today: Probability Primer Quick Reference: Sheldon Ross: Introduction to Probability Models 9th
More informationReading for Lecture 6 Release v10
Reading for Lecture 6 Release v10 Christopher Lee October 11, 2011 Contents 1 The Basics ii 1.1 What is a Hypothesis Test?........................................ ii Example..................................................
More informationHonors Algebra 2 Final Exam 2002
Honors Algebra 2 Final Exam 2002 Name PART A. MULTIPLE CHOICE. Circle the letter in front of each correct answer. You do not have to show work. There is no partial credit. EACH PROBLEM IN THIS SECTION
More information3 Conditional Probability
3 Conditional Probability Question: What are the chances that a college student chosen at random from the U.S. population is a fan of the Notre Dame football team? Now, if the person chosen is a student
More informationSection 13.3 Probability
288 Section 13.3 Probability Probability is a measure of how likely an event will occur. When the weather forecaster says that there will be a 50% chance of rain this afternoon, the probability that it
More informationProbability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides
Probability Review Lecturer: Ji Liu Thank Jerry Zhu for sharing his slides slide 1 Inference with Bayes rule: Example In a bag there are two envelopes one has a red ball (worth $100) and a black ball one
More informationEntropy. Expected Surprise
Entropy Let X be a discrete random variable The surprise of observing X = x is defined as log 2 P(X=x) Surprise of probability 1 is zero. Surprise of probability 0 is (c) 200 Thomas G. Dietterich 1 Expected
More informationStatistical Theory 1
Statistical Theory 1 Set Theory and Probability Paolo Bautista September 12, 2017 Set Theory We start by defining terms in Set Theory which will be used in the following sections. Definition 1 A set is
More informationWeek 2: Probability: Counting, Sets, and Bayes
Statistical Methods APPM 4570/5570, STAT 4000/5000 21 Probability Introduction to EDA Week 2: Probability: Counting, Sets, and Bayes Random variable Random variable is a measurable quantity whose outcome
More information