Sum and Product Rules


Exercise. Consider tossing a coin five times. What is the probability of getting the same result on the first two tosses or the last two tosses?

Solution. Let E be the event that the first two tosses are the same and F be the event that the last two tosses are the same. Let n(E) be the number of ways event E can occur.

n(E) =
n(F) =
n(S) =

Q. Is there any overlap in E and F? I.e., is E ∩ F = ∅?

n(E and F) = n(E ∩ F) =

Now we can calculate the probability P(E or F):

P(E or F) =

Theorem (The Sum Rule). If E and F are events in an experiment, then the probability that E or F occurs is given by:

P(E or F) = P(E) + P(F) − P(E and F)

Example. What is the probability, when a pair of dice is rolled, that at least one die shows a 5 or the dice sum to 8?

Solution.
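The coin-toss exercise above is small enough to check by brute force. A quick sketch in Python (not part of the original notes): enumerate all 2^5 outcomes, count E, F, and their overlap, and compare the Sum Rule against a direct count of E ∪ F.

```python
from itertools import product

# Enumerate all 2^5 = 32 outcomes of five coin tosses.
outcomes = list(product("HT", repeat=5))

E = [w for w in outcomes if w[0] == w[1]]                    # first two tosses the same
F = [w for w in outcomes if w[3] == w[4]]                    # last two tosses the same
both = [w for w in outcomes if w[0] == w[1] and w[3] == w[4]]  # E and F

n_S = len(outcomes)
p_or = (len(E) + len(F) - len(both)) / n_S   # Sum Rule: P(E) + P(F) - P(E and F)
p_direct = len(set(E) | set(F)) / n_S        # direct count of E ∪ F

print(len(E), len(F), len(both), n_S)   # 16 16 8 32
print(p_or, p_direct)                   # 0.75 0.75
```

Both computations agree: P(E or F) = 24/32 = 3/4, and the overlap term is exactly what keeps the 8 outcomes in E ∩ F from being counted twice.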

Exercise. Given a bag of 3 red marbles, 5 black marbles, and 8 green marbles, select one marble and then a second (without replacement). What is the probability that both are red?

Solution.
Q. What is the probability of the first marble being red?
Q. What is the probability of the second marble being red?
Q. What is the probability of both marbles being red?

When the probability of an event E depends on a previous event F having happened, we denote this probability as P(E | F).
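The marble exercise can be sketched in Python (an aside, not from the notes): multiply P(first red) by P(second red | first red), then cross-check by enumerating every ordered pair of draws from the bag.

```python
from fractions import Fraction
from itertools import permutations

red, black, green = 3, 5, 8
total = red + black + green   # 16 marbles in the bag

p_first = Fraction(red, total)                        # P(first red) = 3/16
p_second_given_first = Fraction(red - 1, total - 1)   # P(second red | first red) = 2/15
p_both = p_first * p_second_given_first               # product rule

print(p_both)   # 1/40

# Cross-check by enumerating all ordered draws of two marbles.
bag = ["R"] * red + ["B"] * black + ["G"] * green
draws = list(permutations(bag, 2))   # 16 * 15 = 240 ordered pairs
p_check = Fraction(sum(1 for a, b in draws if a == b == "R"), len(draws))
assert p_check == p_both
```

Using `Fraction` keeps the arithmetic exact, so the conditional-probability product (3/16)(2/15) = 1/40 matches the brute-force count with no rounding.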

Theorem (The Product Rule). If E and F are two events in an experiment, then the probability that both E and F occur is:

(∗)  P(E and F) = P(F) P(E | F) = P(E) P(F | E)

Q. What does it mean if P(E | F) = P(E)?
Q. Therefore, if E and F are two independent events, what is (∗)?

Example. Suppose there is a noisy communication channel in which either a 0 or a 1 is sent, with the following probabilities:

The probability a 0 is sent is 0.4.
The probability a 1 is sent is 0.6.
The probability that, due to noise, a 0 is changed to a 1 during transmission is 0.2.
The probability that, due to noise, a 1 is changed to a 0 during transmission is 0.1.

Suppose that a 1 is received. What is the probability that a 1 was sent?

Let A denote the event that a 1 was received and B denote the event that a 1 was sent.

Q. What is the probability that we are solving for? How can we solve for this?

The Product Rule says that:

P(A and B) = P(B) P(A | B) = P(A) P(B | A)

Therefore:

P(B | A) = P(B) P(A | B) / P(A)

Q. What is P(B)?
Q. What is P(A | B)?
Q. What is P(A)?

Observation. There are only two possible events that result in a 1 being received: either a 1 is sent and received correctly, or a 0 is sent and, due to noise, a 1 is received. The probability that A happens is the sum:

P(A) = P(0 is sent) P(A | 0 is sent) + P(1 is sent) P(A | 1 is sent)

Therefore P(A) =

Now we can compute P(B | A):

P(B | A) =

We used two important relationships in this example. The first is Bayes' Rule.

Theorem (Bayes' Rule). Let A and B be events in the same sample space. If neither P(A) nor P(B) is zero, then:

P(B | A) = P(A | B) P(B) / P(A)
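The noisy-channel example works out numerically as follows (a Python sketch, not part of the original notes): total probability gives P(A), and Bayes' Rule then gives P(B | A).

```python
from fractions import Fraction

p_send0 = Fraction(4, 10)   # P(0 is sent) = 0.4
p_send1 = Fraction(6, 10)   # P(1 is sent) = 0.6
p_flip0 = Fraction(2, 10)   # P(1 received | 0 sent) = 0.2 (noise flips 0 -> 1)
p_keep1 = Fraction(9, 10)   # P(1 received | 1 sent) = 1 - 0.1 = 0.9

# Total probability: A = "a 1 was received"
p_A = p_send0 * p_flip0 + p_send1 * p_keep1

# Bayes' Rule: P(B | A) = P(B) P(A | B) / P(A), with B = "a 1 was sent"
p_B_given_A = p_send1 * p_keep1 / p_A

print(p_A)          # 31/50  (= 0.62)
print(p_B_given_A)  # 27/31  (about 0.871)
```

So even though the channel is noisy, a received 1 was a transmitted 1 about 87% of the time, because 1s are both sent more often and corrupted less often.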

The second is the concept of total probability.

Theorem (Total Probability). Let a sample space S be a disjoint union of events E_1, E_2, ..., E_n with positive probabilities, and let A ⊆ S. Then:

P(A) = Σ_{i=1}^{n} P(A | E_i) P(E_i)

Probability in Complexity Analysis

When we talk about how good an algorithm is, we often use the term worst-case complexity. This means the number of steps the algorithm takes on the worst possible input. Sometimes we care more about the average or expected number of steps.

Example. Searching a list L of integers. In the following list:

L = 3, 5, 23, 6, 4, 1, 7, 10, 26, 8, 9, 11, 15

if we search by visiting each integer from left to right, we will perform 6 visits if we are looking for the number 1 and only 2 visits if we are searching for 5.

Q. How many integers do we visit in the worst case?

Q. If we assume that we are searching for an integer k in the list and that all positions of the list are equally likely to hold the integer we are looking for, what is the probability of finding the integer in the i-th position of a list of length n?

Q. How many steps does it take to find the integer if it is in the i-th position?

We compute the expected number of steps E(n) of the algorithm by computing, for each of the i positions,

P(k in the i-th position) · N(i)

and then taking the sum over all n possible values for i:

E(n) = Σ_{i=1}^{n} P(k in the i-th position) · N(i)

Q. Is this what you expected?
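The expected-steps sum above can be evaluated directly (a Python sketch, not from the notes). Here the probability of each position is 1/n and the cost of finding the target in position i is N(i) = i visits, which are the assumptions stated in the questions above.

```python
from fractions import Fraction

def expected_visits(n):
    """Expected number of visits for left-to-right search when the
    target is equally likely to be in any of the n positions."""
    # E(n) = sum over i of P(position i) * N(i), with P = 1/n and N(i) = i
    return sum(Fraction(1, n) * i for i in range(1, n + 1))

# For the 13-element list in the example:
print(expected_visits(13))   # 7

# In general the sum simplifies to E(n) = (n + 1) / 2:
for n in (1, 2, 5, 100):
    assert expected_visits(n) == Fraction(n + 1, 2)
```

So on average a linear search visits about half the list: (n + 1)/2 steps, compared with n steps in the worst case.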