Dynamic Programming Lecture #4


Outline: Probability Review
- Probability space
- Conditional probability
- Total probability
- Bayes rule
- Independent events
- Conditional independence
- Mutual independence

Probability Space

Sample space: a set Ω.
Event: a subset A ⊆ Ω. We denote by P[A] the probability of A.

Axioms:
1. P[A] ≥ 0
2. P[Ω] = 1
3. For disjoint sets A1, A2, A3, ...: P[A1 ∪ A2 ∪ A3 ∪ ...] = P[A1] + P[A2] + P[A3] + ...

We will restrict our attention to countable probability spaces: Ω = {ω1, ω2, ω3, ...} with P[ωi] = pi, so that for any event A,

    P[A] = Σ_{ωi ∈ A} pi
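A countable space can be represented directly as a map from outcomes to probabilities. This is a minimal Python sketch (the names `prob`, `space`, `coin` are mine, not from the slides), checking the axioms on a fair coin:

```python
from fractions import Fraction

def prob(space, event):
    """P[A] = sum of p_i over outcomes omega_i in the event A."""
    return sum(p for omega, p in space.items() if omega in event)

# Fair coin: Omega = {H, T}, each outcome with probability 1/2.
coin = {"H": Fraction(1, 2), "T": Fraction(1, 2)}

assert prob(coin, {"H"}) >= 0                       # Axiom 1: P[A] >= 0
assert prob(coin, {"H", "T"}) == 1                  # Axiom 2: P[Omega] = 1
# Axiom 3 for the disjoint events {H} and {T}:
assert prob(coin, {"H"}) + prob(coin, {"T"}) == prob(coin, {"H", "T"})
```

Using `Fraction` keeps the arithmetic exact, so the axioms hold with equality rather than up to floating-point error.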

Example: Pair of Dice

Ω = {(i, j) : 1 ≤ i ≤ 6 and 1 ≤ j ≤ 6}, visualized as a 6×6 grid with rows i and columns j.

Set P[(i, j)] = pij. For fair dice, pij = 1/36 for all rolls, i.e., all (i, j) pairs.

To compute probabilities, we must translate statements into events (i.e., subsets):
- Doubles: p11 + p22 + ... + p66
- Larger die = 3: p13 + p23 + p33 + p32 + p31
- Sum of dice = 4: p13 + p22 + p31

The events are the same regardless of the pij values; only the resulting probabilities differ.
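The translation from statements to events is mechanical once the space is a data structure. A sketch (assuming fair dice; `space` and `prob` are illustrative names), with each statement expressed as a predicate on outcomes:

```python
from fractions import Fraction

# Fair pair of dice: p_ij = 1/36 for all 36 rolls (i, j).
space = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(event):
    """Sum the probabilities of outcomes satisfying the predicate."""
    return sum(space[w] for w in space if event(w))

doubles = prob(lambda w: w[0] == w[1])        # p11 + p22 + ... + p66
larger_is_3 = prob(lambda w: max(w) == 3)     # p13 + p23 + p33 + p32 + p31
sum_is_4 = prob(lambda w: w[0] + w[1] == 4)   # p13 + p22 + p31

assert doubles == Fraction(1, 6)
assert larger_is_3 == Fraction(5, 36)
assert sum_is_4 == Fraction(3, 36)
```

Changing the pij values would change the three numbers but not the three predicates, matching the remark above.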

Conditional Probability

Motivation: compare the probability of an event with the probability of the same event GIVEN additional information.

Example: P[sum of dice = 7] versus P[sum of dice = 7 | one of the dice = 5].
Example: probability that the car needs repair, given that the engine light is on.

Conditional probability: let A and B be events with P[B] > 0. The probability of A given B is defined as

    P[A | B] = P[A ∩ B] / P[B]

Note that A ∩ B is another event, interpreted as "both A and B". (Picture: a Venn diagram of A, B, and A ∩ B inside Ω.) Conditioning on B effectively redefines the sample space to be B.

Extreme examples: What is P[A | B] when A ⊆ B? When B ⊆ A? When A ∩ B = ∅?

Example: Dice

What is the probability of (sum of dice ≤ 4) given (larger die = 3)?

Translate to events:

    A : (sum of dice ≤ 4) = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (3, 1)}
    B : (larger die = 3) = {(1, 3), (2, 3), (3, 3), (3, 2), (3, 1)}
    A ∩ B = {(1, 3), (3, 1)}

Result:

    P[A | B] = (p13 + p31) / (p13 + p23 + p33 + p32 + p31)

Neither the definition nor the computation relies on the "natural" probabilities of fair dice.
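The computation can be checked numerically for fair dice. A sketch (helper names mine), applying the definition P[A | B] = P[A ∩ B] / P[B] directly:

```python
from fractions import Fraction

# Fair dice assumed, so each p_ij = 1/36.
space = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(event):
    return sum(p for w, p in space.items() if w in event)

A = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (3, 1)}   # sum of dice <= 4
B = {(1, 3), (2, 3), (3, 3), (3, 2), (3, 1)}           # larger die = 3

cond = prob(A & B) / prob(B)    # P[A | B] = P[A ∩ B] / P[B]
assert cond == Fraction(2, 5)   # (p13 + p31) / (five terms of 1/36 each)
```

With fair dice the answer is (2/36) / (5/36) = 2/5; any other pij assignment would flow through the same two lines.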

Total Probability

Suppose A1 and A2 satisfy

    A1 ∪ A2 = Ω  and  A1 ∩ A2 = ∅

i.e., A1 and A2 form a partition of Ω. For any event B, B = (B ∩ A1) ∪ (B ∩ A2), a disjoint union, so from the axioms:

    P[B] = P[B ∩ A1] + P[B ∩ A2]

Using conditional probability:

    P[B] = P[B | A1] P[A1] + P[B | A2] P[A2]

More generally, given a mutually exclusive partition A1, ..., AN of Ω:

    P[B] = P[B | A1] P[A1] + ... + P[B | AN] P[AN]

(Picture: B cut into pieces by the partition A1, A2, A3 of Ω.)
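The identity can be exercised on the dice space, partitioning by the value of the first die. A sketch (the partition choice and helper names are illustrative, not from the slides):

```python
from fractions import Fraction

space = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(pred):
    return sum(p for w, p in space.items() if pred(w))

def cond(pred_b, pred_a):
    """P[B | A] = P[B ∩ A] / P[A]."""
    return prob(lambda w: pred_b(w) and pred_a(w)) / prob(pred_a)

B = lambda w: w[0] + w[1] == 7                                # sum of dice = 7
partition = [lambda w, k=k: w[0] == k for k in range(1, 7)]   # A_k: first die = k

# Total probability: P[B] = sum_k P[B | A_k] P[A_k]
total = sum(cond(B, A) * prob(A) for A in partition)
assert total == prob(B) == Fraction(1, 6)
```

The `k=k` default-argument trick pins each k inside its lambda; without it every predicate in the list would test against the final loop value.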

Bayes Rule

Recall total probability: given a mutually exclusive partition A1, ..., AN of Ω,

    P[B] = P[B | A1] P[A1] + ... + P[B | AN] P[AN]

For any Ai, the two definitions

    P[Ai | B] = P[Ai ∩ B] / P[B]  and  P[B | Ai] = P[Ai ∩ B] / P[Ai]

have the same numerator on the right-hand side, so

    P[Ai | B] P[B] = P[B | Ai] P[Ai]

Now rewrite, expanding P[B] by total probability. This is known as Bayes rule:

    P[Ai | B] = P[B | Ai] P[Ai] / (P[B | A1] P[A1] + ... + P[B | AN] P[AN])

Utility: hypothesis revision.
- Ai: hypotheses
- B: new data
- P[B | Ai]: anticipated data given hypothesis
- P[Ai]: before-data (prior) belief/confidence
- P[Ai | B]: after-data (posterior) belief/confidence

Example: Cheater Detection

Two coins:

    Coin 1: P[H] = p1, P[T] = 1 − p1
    Coin 2: P[H] = p2, P[T] = 1 − p2

After seeing data, which coin is being used? We never know for sure, but we can compute probabilities. Associate:

    A1: using coin 1
    A2: using coin 2
    B: observed data

Suppose we see 2 flips of the coin: HH. Then

    P[HH | A1] = p1²,  P[HH | A2] = p2²

and similarly for the other outcomes HT, TH, TT. Now apply Bayes rule:

    P[Ai | B] = P[B | Ai] P[Ai] / (P[B | A1] P[A1] + P[B | A2] P[A2])

Numerical example: P[A1] = 0.3, P[A2] = 0.7, p1 = 0.9, p2 = 0.5, B = HH:

    P[A1 | HH] = (0.9)²(0.3) / ((0.9)²(0.3) + (0.5)²(0.7)) ≈ 0.58
    P[A2 | HH] = (0.5)²(0.7) / ((0.9)²(0.3) + (0.5)²(0.7)) ≈ 0.42

i.e., increased suspicion of cheating!
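The numbers above are easy to verify with a few lines of arithmetic (a sketch; variable names are mine):

```python
# Cheater detection via Bayes rule, with the slide's numbers.
p1, p2 = 0.9, 0.5            # P[H] for coin 1 and coin 2
prior1, prior2 = 0.3, 0.7    # P[A1], P[A2]

like1 = p1 ** 2              # P[HH | A1]
like2 = p2 ** 2              # P[HH | A2]

evidence = like1 * prior1 + like2 * prior2   # total probability of B = HH
post1 = like1 * prior1 / evidence            # P[A1 | HH]
post2 = like2 * prior2 / evidence            # P[A2 | HH]

assert round(post1, 2) == 0.58 and round(post2, 2) == 0.42
assert abs(post1 + post2 - 1.0) < 1e-12     # posteriors over a partition sum to 1
```

Note how the posterior on coin 1 nearly doubles its prior of 0.3 after just two heads: the data HH is over three times as likely under coin 1 as under coin 2.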

Independent Events

Definition: events A and B are independent if

    P[A ∩ B] = P[A] P[B]

In terms of conditional probability (for P[B] > 0):

    P[A | B] = P[A ∩ B] / P[B] = P[A] P[B] / P[B] = P[A]   (if independent!)

Knowledge of B does not affect the probability of A.

Example: fair dice, A = doubles, B = (sum of roll ≤ 3).

    P[A ∩ B] = 1/36, but P[A] P[B] = (1/6)(3/36)

so A and B are not independent. Changing B to the event (first die = 3) results in independence.

Independence depends both on the events A and B and on the underlying probabilities.
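Both claims in the example can be checked mechanically (a sketch under the fair-dice assumption; helper names mine):

```python
from fractions import Fraction

space = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(pred):
    return sum(p for w, p in space.items() if pred(w))

A = lambda w: w[0] == w[1]         # doubles
B = lambda w: w[0] + w[1] <= 3     # sum of roll <= 3
AB = lambda w: A(w) and B(w)

# A ∩ B = {(1,1)}, so P[A ∩ B] = 1/36, while P[A]P[B] = (1/6)(3/36) = 1/72:
assert prob(AB) != prob(A) * prob(B)       # NOT independent

B2 = lambda w: w[0] == 3                   # first die = 3
# A ∩ B2 = {(3,3)}: P = 1/36 = (1/6)(1/6), so independence holds:
assert prob(lambda w: A(w) and B2(w)) == prob(A) * prob(B2)
```

Swapping in biased dice (other pij values) could break the second equality, which is the point of the closing remark: independence is a property of the probabilities, not just the sets.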

Conditional Independence

Given an event C, the events A and B are conditionally independent given C if

    P[A ∩ B | C] = P[A | C] P[B | C]

Rewrite:

    P[A ∩ B | C] = P[A ∩ B ∩ C] / P[C] = P[A | B ∩ C] P[B ∩ C] / P[C] = P[A | B ∩ C] P[B | C]

so, under conditional independence,

    P[A | C] = P[A | B ∩ C]

i.e., additional knowledge of B does not affect the probability of A.

Conditioning may change independence. Example: two coins, with probability of heads pA and pB respectively. Randomly choose a coin (A with probability q, B with probability 1 − q), then toss the selected coin 2 times. Events:

    E1 = heads on first toss
    E2 = heads on second toss
    E0 = coin A chosen

Question: are E1 and E2 independent? No. By total probability,

    P[E1 ∩ E2] = P[E1 ∩ E2 | E0] P[E0] + P[E1 ∩ E2 | E0^c] P[E0^c] = pA² q + pB² (1 − q)

while

    P[E1] P[E2] = (pA q + pB (1 − q)) (pA q + pB (1 − q))

and these differ in general. Are E1 and E2 conditionally independent given E0? Yes:

    P[E1 ∩ E2 | E0] = pA² = P[E1 | E0] P[E2 | E0]
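The mixture example can be made concrete by building the joint distribution over (coin, toss 1, toss 2). A sketch (the numeric values of pA, pB, q are illustrative, not from the slides):

```python
from itertools import product

pA, pB, q = 0.9, 0.5, 0.3   # illustrative numbers

def ph(coin):
    """Probability of heads for the chosen coin."""
    return pA if coin == "A" else pB

# Joint distribution: choose a coin, then toss it twice independently.
space = {}
for coin, t1, t2 in product("AB", "HT", "HT"):
    pc = q if coin == "A" else 1 - q
    p1 = ph(coin) if t1 == "H" else 1 - ph(coin)
    p2 = ph(coin) if t2 == "H" else 1 - ph(coin)
    space[(coin, t1, t2)] = pc * p1 * p2

def prob(pred):
    return sum(p for w, p in space.items() if pred(w))

def cond(pred, given):
    return prob(lambda w: pred(w) and given(w)) / prob(given)

E1 = lambda w: w[1] == "H"    # heads on first toss
E2 = lambda w: w[2] == "H"    # heads on second toss
E0 = lambda w: w[0] == "A"    # coin A chosen

# Unconditionally, the tosses are correlated through the coin choice:
assert abs(prob(lambda w: E1(w) and E2(w)) - prob(E1) * prob(E2)) > 1e-9

# Conditioned on E0 they factor: P[E1 ∩ E2 | E0] = P[E1 | E0] P[E2 | E0]
lhs = cond(lambda w: E1(w) and E2(w), E0)
assert abs(lhs - cond(E1, E0) * cond(E2, E0)) < 1e-9
```

Here each toss tells you something about which coin is in play, and hence about the other toss; once the coin is known, that channel of information is closed.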

Mutual Independence

A set of events A1, ..., AN is mutually independent if, for any sub-collection S ⊆ {1, ..., N},

    P[∩_{i ∈ S} Ai] = Π_{i ∈ S} P[Ai]

Pairwise independence does not imply mutual independence.

Example: two fair dice. A: doubles; B: first die = 6; C: second die = 1. (In the 6×6 grid, A is the diagonal, B the row i = 6, C the column j = 1; the pairwise intersections are the single cells (6, 6), (1, 1), and (6, 1).)

A and B are independent, A and C are independent, and B and C are independent. But

    P[A ∩ B ∩ C] = 0 ≠ P[A] P[B] P[C]
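The gap between pairwise and mutual independence is a one-screen check (a sketch with illustrative helper names, assuming fair dice):

```python
from fractions import Fraction

space = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

def prob(pred):
    return sum(p for w, p in space.items() if pred(w))

A = lambda w: w[0] == w[1]   # doubles
B = lambda w: w[0] == 6      # first die = 6
C = lambda w: w[1] == 1      # second die = 1

# Each pair intersects in exactly one cell of probability 1/36 = (1/6)(1/6):
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(lambda w: X(w) and Y(w)) == prob(X) * prob(Y)

# But a double with first die 6 forces second die 6, contradicting C,
# so the triple intersection is empty while the product is not:
assert prob(lambda w: A(w) and B(w) and C(w)) == 0
assert prob(A) * prob(B) * prob(C) == Fraction(1, 216)
```

The definition quantifies over every sub-collection precisely to rule out examples like this one, where all the two-event checks pass and the three-event check fails.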