CS626 Data Analysis and Simulation


CS626 Data Analysis and Simulation
Instructor: Peter Kemper, R 104A, phone 221-3462, email: kemper@cs.wm.edu
Today: Probability Primer
Quick references: Sheldon Ross, Introduction to Probability Models, 9th Edition, AP, Ch. 1; Berthold, Hand: Intelligent Data Analysis, Springer 99, Chapter 2 by Feelders, Statistics Concepts.

Today:
- Sample space and events
- Probabilities defined on events
- Kolmogorov's axioms
- Conditional probabilities
- Independent events
- Excursion on reliability of series-parallel systems
- Bayes' formula

Today's topics: what they are good for
Probabilities are introduced in an axiomatic manner. This:
- Helps to achieve a sound theory
- Helps to clarify what assumptions are necessary for the theory to apply, and what needs to be determined to obtain results
Clarification of terminology is necessary to:
- Be precise
- Avoid misunderstandings based on the ambiguity of our language
Conditional probability and Bayes' formula are fundamental for many applications and form the basis of statistical method (the Bayesian procedure).

Experiment and Sample Space
Definition (Random Experiment): a procedure that has a number of possible outcomes, where it is not certain which one will occur.
Definition (Sample Space): the set of all possible outcomes of an experiment, denoted by S.
Definition (Event): a subset E ⊆ S is called an event. Set operations (union, intersection) apply to events.

Algebra of Events
The algebra of events is defined by five laws, where A, B, C are arbitrary events (sets of outcomes): commutative laws, associative laws, distributive laws, identity laws, complementation laws.
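The equations did not survive transcription; in standard set notation the five laws read:

```latex
\begin{align*}
&\text{Commutative:}    & A \cup B &= B \cup A, & A \cap B &= B \cap A\\
&\text{Associative:}    & (A \cup B) \cup C &= A \cup (B \cup C), & (A \cap B) \cap C &= A \cap (B \cap C)\\
&\text{Distributive:}   & A \cup (B \cap C) &= (A \cup B) \cap (A \cup C), & A \cap (B \cup C) &= (A \cap B) \cup (A \cap C)\\
&\text{Identity:}       & A \cup \emptyset &= A, & A \cap S &= A\\
&\text{Complementation:}& A \cup A^{c} &= S, & A \cap A^{c} &= \emptyset
\end{align*}
```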

Some useful relations based on these laws: idempotent laws, domination laws, absorption laws, De Morgan's laws.
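These derived relations are also missing their equations in the transcription; in standard notation:

```latex
\begin{align*}
&\text{Idempotent:}  & A \cup A &= A, & A \cap A &= A\\
&\text{Domination:}  & A \cup S &= S, & A \cap \emptyset &= \emptyset\\
&\text{Absorption:}  & A \cup (A \cap B) &= A, & A \cap (A \cup B) &= A\\
&\text{De Morgan:}   & (A \cup B)^{c} &= A^{c} \cap B^{c}, & (A \cap B)^{c} &= A^{c} \cup B^{c}
\end{align*}
```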

Graphics for Events
Venn diagrams visualize events A, B as regions inside the sample space S.
Tree diagrams visualize sequential sample spaces: throwing a coin twice branches into H/T at each stage, giving the outcomes (H,H), (H,T), (T,H), (T,T).

Frequency Definition of Probability
If our experiment is repeated over and over again, the proportion of time that event E occurs will converge to P(E). Frequency definition: P(E) = lim_{m→∞} m(E)/m, where m(E) is the number of times event E occurs in m trials.
Notes:
- A random experiment can be repeated under identical conditions; if repeated indefinitely, the relative frequency of occurrence of an event converges to a constant.
- The law of large numbers states that this limit does exist.
- For small m, m(E)/m can show strong fluctuations.
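The convergence of relative frequencies is easy to check empirically. A minimal sketch (the die-rolling experiment and the event E = "roll a six" are illustrative choices, not from the slides):

```python
import random

def relative_frequency(trials: int, seed: int = 0) -> float:
    """Estimate P(roll a six) for a fair die as the relative frequency m(E)/m."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.randint(1, 6) == 6)
    return hits / trials

# For small m the estimate fluctuates strongly; for large m it settles near 1/6.
for m in (10, 1_000, 100_000):
    print(m, relative_frequency(m))
```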

Axiomatic Definition of Probability
Definition: for each event E of the sample space S, we assume that a number P(E) is defined that satisfies Kolmogorov's axioms:
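The axioms themselves are missing from the transcription; they are standard:

```latex
\begin{align*}
&\text{(A1)} & 0 \le P(E) &\le 1 \quad \text{for every event } E\\
&\text{(A2)} & P(S) &= 1\\
&\text{(A3)} & P\Bigl(\bigcup_{i=1}^{\infty} E_i\Bigr) &= \sum_{i=1}^{\infty} P(E_i)
  \quad \text{for mutually exclusive events } E_1, E_2, \dots
\end{align*}
```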

Some useful relations derived from the axioms
What is the probability that E does NOT occur? What is the probability of the impossible event?
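The answers to both questions follow directly from the axioms:

```latex
\begin{align*}
1 &= P(S) = P(E \cup E^{c}) = P(E) + P(E^{c})
   &&\Rightarrow\quad P(E^{c}) = 1 - P(E)\\
P(\emptyset) &= P(S^{c}) = 1 - P(S) = 0
\end{align*}
```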

More relations
What is the probability of a union of two events? What is the probability of a union of a set of events? Is there a better way to calculate this? The sum of disjoint products (SDP) formula.
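In standard notation, the union formulas this slide refers to are inclusion-exclusion:

```latex
\begin{align*}
P(E \cup F) &= P(E) + P(F) - P(EF)\\
P\Bigl(\bigcup_{i=1}^{n} E_i\Bigr) &= \sum_{i} P(E_i) - \sum_{i<j} P(E_i E_j)
  + \sum_{i<j<k} P(E_i E_j E_k) - \dots + (-1)^{n+1}\, P(E_1 E_2 \cdots E_n)
\end{align*}
```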

Probability space, probability system
So far this is fine for a discrete S, but in general we need to be more careful with events E in order to assign probabilities to them. A probability space is a triple (S, F, P):
- S is the sample space,
- F is a σ-field of subsets of S from which events are selected,
- P is a probability measure defined on F that satisfies Kolmogorov's axioms.
F is a collection of subsets of S that is closed under countable unions and complementation. The elements of F are called measurable.

Outline on Problem Solving (Goodman & Hedetniemi 77)
1. Identify the sample space S. All elements must be mutually exclusive and collectively exhaustive; all possible outcomes of the experiment should be listed separately. (Root of tricky problems: often ambiguity or an inexact formulation of the model of a physical situation.)
2. Assign probabilities to all elements of S, consistent with Kolmogorov's axioms. (In practice: estimates based on experience, analysis, or common assumptions.)
3. Identify the events of interest: recast statements as subsets of S; use the laws of the algebra of events for simplification and visualizations for clarification.
4. Compute the desired probabilities using the axioms and laws. Often helpful: express the event of interest as a union of mutually exclusive events and sum up their probabilities.

Conditional Probabilities
Definition: the conditional probability of E given F is P(E|F) = P(EF)/P(F) if P(F) > 0, and it is undefined otherwise.
Interpretation: given that F has happened, only the outcomes in EF are still possible for E, so the original probability P(EF) is rescaled by 1/P(F).
Multiplication rule: P(EF) = P(F) P(E|F).

Two examples
1. A family has two children. What is the probability that both children are boys, given that at least one of them is a boy? (Assume a sample space S where all outcomes are equally likely.)
2. Bev can take a computer science course and get an A with probability 1/2, or a chemistry course and get an A with probability 1/3. If she flips a fair coin to decide, what is the probability that Bev will get an A in chemistry? Let C be the event that Bev takes chemistry and A the event that she receives an A in whatever she takes.
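The solutions are not on the slide, but both are standard applications of the definitions above:

```latex
\begin{align*}
P(\text{both boys} \mid \text{at least one boy})
  &= \frac{P(\{(b,b)\})}{P(\{(b,b),(b,g),(g,b)\})}
   = \frac{1/4}{3/4} = \frac{1}{3}\\[4pt]
P(AC) &= P(C)\,P(A \mid C) = \tfrac12 \cdot \tfrac13 = \tfrac16
\end{align*}
```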

Using conditional probabilities is trivial?
A variation of a classic example: a professional gambler invites you to a game for $50. He has three little cups and one little ball; the ball goes under one of the cups and he mixes the cups. You pick a cup. Whether you are right or wrong, the gambler will reveal one of the other cups that does not hide the ball (choosing equally likely between the two if you picked the right one). It is your choice:
- To stick with your first guess.
- To change your mind and switch to the other remaining cup.
Then: if you guess the right cup you win $50; if you fail you lose $50 to him.
Two questions: Which alternative is better according to probability theory? Why do you lose in practice while your neighbor has more luck?

Gambling with professionals
S = {A, B, C}. Initial probabilities, all equal: P(A) = P(B) = P(C) = 1/3. Assume you pick A, and C is subsequently lifted and shown empty. Since P(A) = 1/3 and P(A^c) = 2/3, the chance that the ball is under B or C is 2/3, and once the empty cup C is lifted you can collect all of it with B! Conditional probabilities seem to tell a different story:
P(A | C^c) = P(A C^c) / P(C^c) = P(A) / P(C^c) = (1/3) / (2/3) = 1/2
P(A^c | C^c) = P(A^c C^c) / P(C^c) = P(B) / P(C^c) = (1/3) / (2/3) = 1/2
What is right?

Independent events
Definition: two events E and F are independent if P(EF) = P(E) P(F). This also means P(E|F) = P(E) when P(F) > 0. In English: E and F are independent if knowledge that F has occurred does not affect the probability that E occurs.
Notes:
- If E, F are independent, then so are E, F^c and E^c, F and E^c, F^c.
- The definition generalizes from 2 to n events; e.g., for n = 3 every subset must be independent.
- Mutually exclusive is not the same as independent.

Example
Toss two fair dice. Let E1 be the event that the sum of the dice is six and F the event that the first die is a four. Then P(E1) = 5/36, P(F) = 1/6, and P(E1 F) = P({(4,2)}) = 1/36 ≠ 5/216 = P(E1) P(F). Thus E1 and F are not independent.
Same experiment, except let E2 be the event that the sum of the dice is seven. Then P(E2) = 6/36 = 1/6 and P(E2 F) = P({(4,3)}) = 1/36 = P(E2) P(F). Thus E2 and F are independent.
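Both claims can be verified by exact enumeration of the 36 outcomes. A sketch (the helper `prob` and the variable names are mine, not from the slides; exact fractions avoid floating-point equality issues):

```python
from fractions import Fraction
from itertools import product

dice = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(event) -> Fraction:
    """Exact probability of an event, given as a predicate on outcomes."""
    return Fraction(sum(1 for o in dice if event(o)), len(dice))

p_f   = prob(lambda o: o[0] == 4)             # first die shows four
p_e1  = prob(lambda o: o[0] + o[1] == 6)      # sum is six
p_e2  = prob(lambda o: o[0] + o[1] == 7)      # sum is seven
p_e1f = prob(lambda o: o[0] + o[1] == 6 and o[0] == 4)
p_e2f = prob(lambda o: o[0] + o[1] == 7 and o[0] == 4)

print(p_e1f == p_e1 * p_f)  # False: E1 and F are dependent
print(p_e2f == p_e2 * p_f)  # True: E2 and F are independent
```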

Joint and pairwise independence
A ball is drawn from an urn containing four balls numbered 1, 2, 3, 4. One can define three events on this experiment that are pairwise independent but not jointly independent.
A sequence of experiments results in either a success or a failure, where E_i, i ≥ 1, denotes a success in the i-th experiment. If for all i_1, i_2, ..., i_n the probability of the joint event factors, P(E_{i_1} E_{i_2} ... E_{i_n}) = P(E_{i_1}) P(E_{i_2}) ... P(E_{i_n}), we say the sequence of experiments consists of independent trials.
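The events themselves did not survive transcription; the classic construction for this four-ball urn (as in Ross) takes E = {1,2}, F = {1,3}, G = {1,4}:

```latex
\begin{align*}
P(E) = P(F) = P(G) &= \tfrac12,\\
P(EF) = P(EG) = P(FG) &= P(\{1\}) = \tfrac14 = \tfrac12 \cdot \tfrac12
  && \text{(pairwise independent)}\\
P(EFG) &= P(\{1\}) = \tfrac14 \ne \tfrac18 = P(E)P(F)P(G)
  && \text{(not jointly independent)}
\end{align*}
```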

Excursion: Reliability Analysis with Reliability Block Diagrams
Reliability of series-parallel systems. Motivation: illustrate how probabilities can be applied and how powerful the independence assumption is.
We consider a set of components with index i = 1, 2, ...
- Event A_i = component i is functioning properly.
- The reliability R_i of component i is the probability P(A_i).
- Series system: the entire system fails if any of its components fails.
- Parallel system: the entire system fails if all of its components fail.
Key assumption: failures of components are independent. For now R is a probability; later R will be a function of time t.

Reliability Analysis (if component failures are independent)
Reliability of a series system: R_s = R_1 R_2 ... R_n (product law of reliabilities), based on the series connection and independence. Note how quickly R_s degrades as n = 1, 2, ... grows.
Reliability of a parallel system: let F_i = 1 - R_i be the unreliability of a component and F_p = 1 - R_p that of a parallel system. Then F_p = F_1 F_2 ... F_n (product law of unreliabilities), i.e., R_p = 1 - (1-R_1)(1-R_2)...(1-R_n). Note also the law of diminishing returns: the rate of increase in reliability decreases rapidly as n increases.
Reliability of a series-parallel system: of n serial stages, stage i has n_i identical components in parallel, so the stage reliabilities 1 - (1-R_i)^{n_i} multiply.
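The product laws are easy to sketch in code. A minimal illustration under the independence assumption (function names are mine, not from the slides):

```python
from math import prod

def series_reliability(rs: list[float]) -> float:
    """R_s = product of component reliabilities: the system works iff every component works."""
    return prod(rs)

def parallel_reliability(rs: list[float]) -> float:
    """R_p = 1 - product of unreliabilities: the system fails iff every component fails."""
    return 1 - prod(1 - r for r in rs)

def series_parallel_reliability(stages: list[tuple[float, int]]) -> float:
    """Series chain of stages; stage i has n_i identical parallel components of reliability R_i."""
    return prod(1 - (1 - r) ** n for r, n in stages)

# Series reliability degrades with n; parallel redundancy shows diminishing returns.
print(series_reliability([0.9] * 3))    # 0.9**3 ≈ 0.729
print(parallel_reliability([0.9] * 3))  # 1 - 0.1**3 ≈ 0.999
```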

Reliability Block Diagrams
Example: a series-parallel RBD of a network, e.g., R1 and R2 in series, followed by parallel replicas of R3 and of R4, and finally R5.
Other representations: fault trees.
Limits: more general dependencies require other techniques: the structure function, the inclusion/exclusion formula (or SDP), approaches with binary decision diagrams (BDD; Zang 99, in Trivedi Ch. 1), factoring/conditioning, and more techniques for more general settings.

Bayes' Formula
Let E and F be events. We may express E as E = EF ∪ EF^c. Because EF and EF^c are mutually exclusive, P(E) = P(EF) + P(EF^c) = P(E|F) P(F) + P(E|F^c) P(F^c).
In English: the probability of event E is a weighted average of the conditional probability of E given that F has occurred and the conditional probability of E given that F has not occurred.

Example: a student takes a multiple-choice test. Let:
p: probability that he/she knows the answer,
1-p: probability that he/she guesses.
Assume guessing succeeds with probability 1/m, where m is the number of multiple-choice alternatives. What is the conditional probability that the student knew the answer to a question, given that he/she answered it correctly?
Let C be the event that the student answers correctly and K the event that the student actually knew the answer. Known: P(K) = p, P(K^c) = 1-p, P(C|K^c) = 1/m, P(C|K) = 1.
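The computation itself is missing from the transcription; applying Bayes' formula with the given quantities:

```latex
P(K \mid C) = \frac{P(C \mid K)\,P(K)}{P(C \mid K)\,P(K) + P(C \mid K^{c})\,P(K^{c})}
            = \frac{p}{p + (1-p)/m}
            = \frac{mp}{1 + (m-1)p}
```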

Another example: a laboratory blood test. The test is:
- 95% effective in detecting a certain disease when it is present,
- subject to a 1% error rate of saying that a healthy person has the disease.
Question: if 0.5% of the population has the disease, what is the probability that a person has the disease given that the test result is positive?
Let D be the event that the tested person has the disease and E the event that the test result is positive. Known: P(E|D) = .95, P(E|D^c) = .01, P(D) = .005, P(D^c) = .995.
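A short numerical check of this example (a sketch; the slide stops at the givens, and the function name and parameters are mine):

```python
def bayes_posterior(prior: float, sensitivity: float, false_positive: float) -> float:
    """P(D|E) via Bayes' formula: P(E|D)P(D) / (P(E|D)P(D) + P(E|D^c)P(D^c))."""
    evidence = sensitivity * prior + false_positive * (1 - prior)
    return sensitivity * prior / evidence

# With P(E|D)=.95, P(E|D^c)=.01, P(D)=.005, a positive test still leaves
# only about a 32% chance of disease, because the disease is rare.
p = bayes_posterior(prior=0.005, sensitivity=0.95, false_positive=0.01)
print(round(p, 3))  # 0.323
```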

Bayes' Formula (general form)
Let F_1, F_2, ..., F_n be events of S, all mutually exclusive and collectively exhaustive.
Theorem of total probability (also rule of elimination): P(E) = Σ_{i=1}^{n} P(E|F_i) P(F_i).
Bayes' formula helps us determine which F_j happened, given that we observed E: P(F_j|E) = P(E|F_j) P(F_j) / Σ_{i=1}^{n} P(E|F_i) P(F_i).

Gambling with professionals revisited
Recall: S = {A, B, C}, P(A) = P(B) = P(C) = 1/3; you pick A and C is lifted and shown empty. The intuitive argument gives 2/3 for switching to B, while the naive conditional probabilities P(A | C^c) = P(A^c | C^c) = 1/2 seem to tell a different story. What is right?

Gambling with professionals... Bayes' Theorem
Scenario: you pick cup A, the gambler opens cup C. Question: the success probability of switching, P(B | Gc).
Sample spaces: {A, B, C} for "the ball is under A, B, or C" and {Ga, Gb, Gc} for "the gambler opens A, B, or C".
Probabilities: P(A) = P(B) = P(C) = 1/3, and
P(Gc|A) = 1/2, so P(Gc|A) P(A) = 1/2 · 1/3 = 1/6
P(Gc|B) = 1, so P(Gc|B) P(B) = 1 · 1/3 = 1/3
P(Gc|C) = 0, so P(Gc|C) P(C) = 0 · 1/3 = 0
Bayes' theorem applied: P(B|Gc) = P(Gc|B) P(B) / X, where X = P(Gc|A) P(A) + P(Gc|B) P(B) + P(Gc|C) P(C), such that P(B|Gc) = (1 · 1/3) / (1/6 + 1/3 + 0) = (1/3) / (1/2) = 2/3. Switching wins with probability 2/3.
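A simulation of the cup game confirms the 2/3 figure. A sketch following the slide's scenario (cup indices and the helper `play` are my conventions):

```python
import random

def play(switch: bool, rng: random.Random) -> bool:
    """One round: ball hidden uniformly; player picks cup 0; gambler opens an empty, unpicked cup."""
    ball = rng.randrange(3)
    pick = 0
    # The gambler opens a cup that is neither the pick nor the ball
    # (uniformly chosen when two such cups exist, as on the slide).
    opened = rng.choice([c for c in range(3) if c != pick and c != ball])
    if switch:
        pick = next(c for c in range(3) if c != pick and c != opened)
    return pick == ball

rng = random.Random(0)
wins = sum(play(switch=True, rng=rng) for _ in range(100_000)) / 100_000
print(wins)  # close to 2/3
```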

Summary
- Sample space and events
- Probabilities defined on events
- Kolmogorov's axioms
- Conditional probabilities
- Independent events
- Bayes' formula