CSE 21 Math for Algorithms and Systems Analysis. Lecture 10 Conditional Probability


1 CSE 21 Math for Algorithms and Systems Analysis, Lecture 10: Conditional Probability

2 Outline
Review of definitions of probability
Conditional probability
Decision trees and probability
Intro to Bayes rule

3 Probability Definition
A probability space is given by two things:
A set, U, called the sample space (think of this as all possible things of interest that can occur).
A probability function, f, that specifies how likely each of the elements in U is to occur.
In order to be a valid probability space we require that
$\sum_{x \in U} f(x) = 1$ and $0 \le f(x) \le 1$ for all $x \in U$.

4 Probability of an Event
We define an event as a subset of the elements of the set U. Suppose the set E is an event ($E \subseteq U$). The probability of the event E occurring is
$P(E) = \sum_{x \in E} f(x)$.
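
A minimal sketch (not from the lecture) of these two definitions in Python, using a fair six-sided die as the sample space U and the event E = "the roll is even":

```python
# Sample space U = {1, ..., 6} with probability function f(x) = 1/6 for each outcome.
f = {outcome: 1/6 for outcome in range(1, 7)}

# Validity check: the values of f sum to 1 and each lies between 0 and 1.
assert abs(sum(f.values()) - 1) < 1e-9
assert all(0 <= p <= 1 for p in f.values())

# An event is a subset of U; its probability is the sum of f(x) over x in the event.
E = {2, 4, 6}
print(sum(f[x] for x in E))  # 0.5
```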

5 Probability Sample Problems
Consider drawing a 5-card hand from a deck of cards. Compute the probability of:
Getting at least 2 queens
Getting a full house
Getting a royal flush

6 Probability and Counting the Complement
$P(E) = 1 - P(\bar{E})$
Example: the Birthday Problem. What is the probability that out of a class of n people at least two people share the same birthday? (My apologies to anyone born on February 29th!)
The probability that at least two people in this class share a birthday is:
$P(E) = 1 - \dfrac{\binom{365}{n}\, n!}{365^n}$
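
A short sketch of this formula in Python (the class size of 62 used on the next slide appears here only as an illustration):

```python
from math import prod

def p_shared_birthday(n):
    # P(at least two of n people share a birthday)
    # = 1 - (365 * 364 * ... * (365 - n + 1)) / 365**n
    if n > 365:
        return 1.0
    p_all_distinct = prod(365 - i for i in range(n)) / 365**n
    return 1 - p_all_distinct

print(p_shared_birthday(23))  # roughly 0.51
print(p_shared_birthday(62))  # roughly 0.996
```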

7 Another Component of the Birthday Problem
Suppose we have a class of 62 people. Suppose I start from the first person and ask them to state their birthday. Then I ask if anyone in the class shares the same birthday. What is the probability that we identify a common birthday within n people?

8 Workspace

9 Probabilities for Some Values of n
(Table of values on the slide: the probability after 1, 2, 3, 4, 5, 6, 7, 8, and 20 selections.)

10 Binomial Probability Distribution
Suppose we perform a sequence of n trials. The probability of a success on each trial is q. Each trial is independent. The probability of achieving exactly k successes in n trials is:
$P(k \text{ successes}) = \binom{n}{k} q^k (1-q)^{n-k}$
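
A small sketch of this formula in Python; the coin-flip numbers are just an illustration, not from the slides:

```python
from math import comb

def binomial_pmf(n, k, q):
    # P(exactly k successes in n independent trials, each with success probability q)
    return comb(n, k) * q**k * (1 - q)**(n - k)

# e.g. exactly 3 heads in 10 flips of a fair coin
print(binomial_pmf(10, 3, 0.5))  # about 0.117
```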

11 Probability and Decision Trees
Suppose we observe San Diego weather over a period of 3 days. What is the probability that it rains on exactly two of the days? Assume (for now) that the probability of it raining on any particular day is 1/10 and that the event of it raining on any particular day is independent of it raining on any other day. Compute the probability of it raining on exactly two of the days.
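
For reference, one way the arithmetic can go: this is the binomial formula with n = 3, k = 2, q = 1/10, so

$P(\text{exactly 2 rainy days}) = \binom{3}{2}\left(\tfrac{1}{10}\right)^2\left(\tfrac{9}{10}\right) = 3 \cdot 0.01 \cdot 0.9 = 0.027.$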

12 Workspace

13 Conditional Probability
Here is another model of San Diego weather. Let R_i be the event that it rains on day i and S_i be the event that it is sunny on day i.
P(R_i) = 1/10
P(R_i | R_{i-1}) = 7/10 (the bar "|" is read as "given")
P(S_i | S_{i-1}) = 95/100

14 Definition of Conditional Probability
$P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$
(Venn diagram of events A and B inside the sample space U shown on the slide.)
Given f(x) = 1/15 for all x, what is P(A | B)?

15 Conditional Probability and Conditional Independence
Recall our definition of the independence of two events A and B. We say A and B are independent events if and only if
$P(A \cap B) = P(A)P(B)$
Two events A and B are conditionally independent given a third event C if and only if
$P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C)$
Equivalently:
$P(A \mid B \cap C) = P(A \mid C)$

16 Back to our Example About the Weather in San Diego
Suppose we want to know the probability of it raining on two out of three days in San Diego. Recall our model of the weather:
P(R_i) = 1/10
P(R_i | R_{i-1}) = 7/10
P(S_i | S_{i-1}) = 95/100
Additionally, suppose R_i is conditionally independent of R_1, ..., R_{i-2} given R_{i-1}.

17 Representing Conditional Probabilities Using Decision Trees
(Decision tree over Days 1-3 shown on the slide. Day 1 branches: P(S_1) = .9, P(R_1) = .1. Day 2 branches: P(S_2 | S_1) = .95, P(R_2 | S_1) = .05, P(S_2 | R_1) = 3/10, P(R_2 | R_1) = 7/10. Day 3 branches continue the same pattern, e.g. P(S_3 | S_2 ∩ S_1).)

18 Decision Trees and The Rule of Product
The rule of product states that
$P(A_1 \cap A_2 \cap \cdots \cap A_n) = P(A_1)\, P(A_2 \mid A_1)\, P(A_3 \mid A_1 \cap A_2) \cdots P(A_n \mid A_1 \cap \cdots \cap A_{n-1})$
Remember the commutative property of set intersection. What does this tell us about the preceding rule?

19 Decision Trees and the Rule of Product
(The same decision tree, with probabilities read off by multiplying along the branches: P(S_1) = .9, P(R_1) = .1, P(S_2 | S_1) = .95, P(S_2 | R_1) = 3/10, P(R_3 | S_1 ∩ S_2) = .05, P(R_3 | R_1 ∩ S_2) = .05. Exercise: compute P(S_1 ∩ R_2 ∩ R_3) and P(R_1 ∩ S_2 ∩ R_3).)

20 Sample Problem: What is the Probability that it rains exactly 2 out of 3 days?

21 Sample Problem: What is the Probability that it rains on day 3?
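
A sketch of how both sample problems can be worked by applying the rule of product along each root-to-leaf path of the tree, using the model given earlier (P(R_1) = 1/10, P(R_i | R_{i-1}) = 7/10, P(S_i | S_{i-1}) = 95/100, each day conditionally independent of earlier days given the previous day):

```python
from itertools import product

P_R1 = 0.1
P_R_given_R = 0.7    # so P(S_i | R_{i-1}) = 0.3
P_S_given_S = 0.95   # so P(R_i | S_{i-1}) = 0.05

def path_probability(days):
    # days is a tuple like ('R', 'S', 'R'); multiply the branch probabilities
    # along the corresponding root-to-leaf path (the rule of product).
    p = P_R1 if days[0] == 'R' else 1 - P_R1
    for prev, cur in zip(days, days[1:]):
        if prev == 'R':
            p *= P_R_given_R if cur == 'R' else 1 - P_R_given_R
        else:
            p *= 1 - P_S_given_S if cur == 'R' else P_S_given_S
    return p

leaves = list(product('RS', repeat=3))
print(sum(path_probability(d) for d in leaves))                       # 1.0 (sanity check)
print(sum(path_probability(d) for d in leaves if d.count('R') == 2))  # rains exactly 2 of 3 days
print(sum(path_probability(d) for d in leaves if d[2] == 'R'))        # rains on day 3
```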

22 More Examples of Probability and Decision Trees
Suppose an urn is filled with 3 green marbles, 2 blue marbles, and 1 red marble. A marble is selected:
If it is red, it is not put back in the urn.
If it is blue, it is put back in the urn along with 3 additional blue marbles.
If it is green, it is put back in the urn along with 5 additional blue marbles.
A second marble is selected. What is the probability that both marbles are green?
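
A sketch of the same decision-tree reasoning for the urn problem, tracking the urn contents along the relevant first-draw branch and applying the rule of product (exact fractions keep the arithmetic clean):

```python
from fractions import Fraction

start = {'green': 3, 'blue': 2, 'red': 1}

def urn_after_first_draw(color):
    # Urn contents before the second draw, following the rules on the slide.
    urn = dict(start)
    if color == 'red':
        urn['red'] -= 1        # red is not put back
    elif color == 'blue':
        urn['blue'] += 3       # blue is put back plus 3 additional blue marbles
    else:
        urn['blue'] += 5       # green is put back plus 5 additional blue marbles
    return urn

p_first_green = Fraction(start['green'], sum(start.values()))
urn2 = urn_after_first_draw('green')
p_second_green = Fraction(urn2['green'], sum(urn2.values()))

print(p_first_green * p_second_green)  # P(both green) = 3/22
```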

23 Workspace

24 Workspace

25 Intro to Bayes Rule
Bayes rule can be easily derived from the product rule:
$P(A \cap B) = P(A)\,P(B \mid A)$
$P(A \cap B) = P(B \cap A) = P(B)\,P(A \mid B)$
Setting the two expressions for $P(A \cap B)$ equal and dividing by $P(B)$ gives
$P(A \mid B) = \dfrac{P(A)\,P(B \mid A)}{P(B)}$

26 More Conditional Probability Problems
Suppose there are two types of animals in the world: cats and dogs.
P(dog) = .7, P(cat) = .3
P(animal weighs over 40 pounds | cat) = .001
P(animal weighs over 40 pounds | dog) = .5
P(animal has a tail over 6 inches | cat) = .99
P(animal has a tail over 6 inches | dog) = .8
Suppose the tail length and weight are conditionally independent given the identity of the animal.
Compute P(dog | animal weighs over 40 pounds and has a tail over 6 inches).
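
A sketch of this computation using Bayes rule together with the stated conditional independence, so that P(W and T | animal) = P(W | animal) P(T | animal), where W is "weighs over 40 pounds" and T is "tail over 6 inches":

```python
priors = {'dog': 0.7, 'cat': 0.3}
p_W = {'dog': 0.5, 'cat': 0.001}   # P(weighs over 40 pounds | animal)
p_T = {'dog': 0.8, 'cat': 0.99}    # P(tail over 6 inches | animal)

# Numerator of Bayes rule for each animal; the denominator P(W and T)
# is the sum of the numerators (law of total probability).
joint = {a: priors[a] * p_W[a] * p_T[a] for a in priors}
p_evidence = sum(joint.values())

print(joint['dog'] / p_evidence)   # P(dog | W and T), about 0.999
```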

27 Workspace

28 Example from Machine Learning (not officially part of the class)
Consider the problem of distinguishing smiles from non-smiles.

29 What is an image?
An image is a collection of pixels. Each number specifies the brightness of the pixel at a particular location. E.g.:

30 Goal
Our goal will be to predict whether a person is smiling or not given an image of their face: P(smile | pixels).
By Bayes rule:
$P(\text{smile} \mid \text{pixels}) = \dfrac{P(\text{pixels} \mid \text{smile})\, P(\text{smile})}{P(\text{pixels})}$
Our method of predicting will be to always predict the most likely category, so we will predict a smile if
$\dfrac{P(\text{smile} \mid \text{pixels})}{P(\text{no smile} \mid \text{pixels})} > 1$

31 Prediction Rule
After applying Bayes rule, we predict smile when:
$\dfrac{P(\text{pixels} \mid \text{smile})\, P(\text{smile}) / P(\text{pixels})}{P(\text{pixels} \mid \text{no smile})\, P(\text{no smile}) / P(\text{pixels})} = \dfrac{P(\text{pixels} \mid \text{smile})\, P(\text{smile})}{P(\text{pixels} \mid \text{no smile})\, P(\text{no smile})} > 1$

32 How do we apply this rule?
We need to build a model of the pixels given each of the categories. Without going into too much detail about how this is done, the simplest method is to create a Naïve Bayes classifier. Here we assume that the pixel brightnesses at any two locations are conditionally independent given the category (smile / not smile).
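
A toy sketch of such a classifier (the data here are made up and far smaller than real face images; Laplace smoothing, which the lecture does not discuss, is added only to avoid zero probabilities):

```python
import numpy as np

def train(X, y):
    # X: (n_examples, n_pixels) array of 0/1 pixel values; y: array of 0/1 labels.
    priors, likelihoods = {}, {}
    for c in (0, 1):
        Xc = X[y == c]
        priors[c] = len(Xc) / len(X)
        likelihoods[c] = (Xc.sum(axis=0) + 1) / (len(Xc) + 2)  # P(pixel_j = 1 | class c)
    return priors, likelihoods

def predict(x, priors, likelihoods):
    scores = {}
    for c in (0, 1):
        p = likelihoods[c]
        # Naive Bayes: pixels are conditionally independent given the class,
        # so multiply per-pixel probabilities (done in log space for stability).
        scores[c] = np.log(priors[c]) + np.sum(x * np.log(p) + (1 - x) * np.log(1 - p))
    return max(scores, key=scores.get)

# Tiny made-up data set: class 1 ("smile") tends to have the first two pixels on.
X = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1], [0, 1, 0, 1]])
y = np.array([1, 1, 0, 0])
priors, likelihoods = train(X, y)
print(predict(np.array([1, 1, 0, 1]), priors, likelihoods))  # predicts 1
```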

33 An Example of Probability of a Pixel

34 An Example of Probability of a Pixel

35 Now I can visualize how good each pixel is at predicting the expression
