
Notes slides from before lecture CSE 21, Winter 2017, Section A00 Lecture 15 Notes Class URL: http://vlsicad.ucsd.edu/courses/cse21-w17/

Notes slides from before lecture Notes March 6 (1) This week: Days 21, 22, 23 of posted slides
- Probability distributions, conditional probability, expected value, Bayes' theorem
- HW7 is due on WEDNESDAY, March 8
- MT2: 3-day (= 72-hour) window for regrade requests. Please ask questions about MT2 grading in OHs. Discussion section this week will spend some time going over MT2.
- Poll for Final exam reviews: closes tomorrow (Tuesday) at 11:59pm
- Any logistic or other issues?

Probability Distributions. Conditional Probability. CSE21 Winter 2017, Day 21 (B00), Day 15 (A00) March 6, 2017 http://vlsicad.ucsd.edu/courses/cse21-w17

Probability Rosen p. 446, 453 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 <= p(s) <= 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1.

Probability Rosen p. 446, 453 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 <= p(s) <= 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1. Compare flipping a fair coin and a biased coin: A. Have different sample spaces. B. Have the same sample space but different probability distributions. C. Have the same sample space and same probability distribution. Answer: B. Both coins have sample space {H, T}, but the biased coin assigns different probabilities to the two outcomes.

Probability Rosen p. 446, 453-4 Sample space, S: (finite or countable) set of possible outcomes. Probability distribution, p: assignment of probabilities to outcomes in S so that - 0 <= p(s) <= 1 for each s in S. - Sum of probabilities is 1: Σ_{s in S} p(s) = 1. Event, E: subset of possible outcomes.

Uniform distribution Rosen p. 454 For sample space S with n elements, uniform distribution assigns the probability 1/n to each element of S. When flipping a fair coin successively three times: A. The sample space is {H, T} B. The empty set is not an event. C. The event {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT} has probability less than 1. D. The uniform distribution assigns probability 1/8 to each outcome. E. None of the above.

Uniform distribution Rosen p. 454 Answer: D. The sample space is {H, T}^3, which has 8 outcomes, so the uniform distribution assigns probability 1/8 to each outcome. (The empty set is an event, and the listed event is the whole sample space, so its probability is exactly 1.)

Uniform distribution Rosen p. 454 For sample space S with n elements, the uniform distribution assigns the probability 1/n to each element of S. When flipping a fair coin successively three times, what is the distribution of the number of Hs that appear? A. Uniform distribution. B. P( 0 H ) = P( 3 H ) = 3/8 and P( 1 H ) = P( 2 H ) = 1/8. C. P( 0 H ) = P( 1 H ) = 1/8 and P( 2 H ) = P( 3 H ) = 1/8. D. P( 0 H ) = P( 3 H ) = 1/8 and P( 1 H ) = P( 2 H ) = 3/8. E. None of the above.

Uniform distribution Rosen p. 454 Answer: D. Of the 8 equally likely outcomes, one has no Hs, three have one H, three have two Hs, and one has three Hs, so P( 0 H ) = P( 3 H ) = 1/8 and P( 1 H ) = P( 2 H ) = 3/8. The distribution is not uniform.
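A quick way to verify this distribution is to enumerate the eight equally likely outcomes; a minimal Python sketch:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 2^3 equally likely outcomes of three fair coin flips.
outcomes = list(product("HT", repeat=3))

# Distribution of the number of Hs: P(k Hs) = (# outcomes with k Hs) / 8.
dist = {}
for seq in outcomes:
    k = seq.count("H")
    dist[k] = dist.get(k, 0) + Fraction(1, len(outcomes))

assert dist[0] == dist[3] == Fraction(1, 8)
assert dist[1] == dist[2] == Fraction(3, 8)
```

The asserts confirm the answer: only the all-H and all-T sequences give the extreme counts, while three sequences each give one or two Hs.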

Probability and counting If we start with the uniform distribution on a set S, then the probability of an event E is P(E) = |E| / |S|. When flipping n fair coins, what is the probability of getting exactly k Hs? A. 1/n B. k/n C. 1/2^n D. C(n,k) / 2^n E. None of the above.

Binomial distribution When flipping n fair coins, what is the probability of getting exactly k Hs? Possible coin toss sequences: { HH..HH, HH..HT, ..., TT..TH, TT..TT } There are 2^n equally likely sequences, and exactly C(n,k) of them contain exactly k Hs, so the probability is C(n,k) / 2^n.

Binomial distribution When flipping n fair coins, what is the probability of getting exactly k Hs? Possible coin toss sequences: { HH..HH, HH..HT, ..., TT..TH, TT..TT } What if the coin isn't fair?

Binomial distribution Bernoulli trial: a performance of an experiment with two possible outcomes. e.g. flipping a coin Binomial distribution: probability of exactly k successes in n independent Bernoulli trials, when the probability of success is p. e.g. # Hs in n coin flips when the probability of H is p What is this probability? Rosen p. 480 A. C(n,k) / 2^n B. p^k / 2^n C. C(n,k) p^k D. C(n,k) p^k (1-p)^(n-k) E. None of the above. Answer: D. Each particular sequence with k successes and n-k failures has probability p^k (1-p)^(n-k), and there are C(n,k) such sequences.
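The binomial formula can be checked against brute-force enumeration; a sketch (the function name binomial_pmf is my own, and Python's math.comb computes C(n,k)):

```python
from math import comb
from itertools import product

def binomial_pmf(n, k, p):
    # P(exactly k successes in n independent Bernoulli(p) trials)
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Brute-force check for n = 4, p = 0.3: sum the probabilities of all
# length-4 outcome sequences (1 = success) that contain exactly k successes.
n, p = 4, 0.3
for k in range(n + 1):
    brute = sum(
        p**seq.count(1) * (1 - p)**seq.count(0)
        for seq in product([0, 1], repeat=n)
        if seq.count(1) == k
    )
    assert abs(brute - binomial_pmf(n, k, p)) < 1e-12
```

Setting p = 1/2 recovers the fair-coin answer C(n,k) / 2^n from the previous slide.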

Randomness in Computer Science When the input is random: data mining, elections, weather, stock prices, genetic markers, analyzing experimental data. When is analysis valid? When are we overfitting to available data?

Randomness in Computer Science When the desired output is random - picking a cryptographic key - performing a scientific simulation - programming the adversary in a computer game

Randomness in Computer Science When the algorithm uses randomness - Monte Carlo methods Rosen p. 463 - search heuristics that avoid local minima - randomized hashing - quicksort

Dangers of probabilistic reasoning "Intuitive probabilistic reasoning".

Conditional probabilities The probability of an event may change if we have additional information about outcomes. The conditional probability of E given F, written P(E | F), is the probability that E occurs knowing that F also occurs. Suppose E and F are events, and P(F) > 0. Then P(E | F) = P(E ∩ F) / P(F), i.e., P(E ∩ F) = P(E | F) P(F). Rosen p. 456

Conditional probabilities Note that if P(E | F) = P(E), then we say that the events E and F are independent. So if E and F are independent then we have that P(E ∩ F) = P(E) P(F). Rosen p. 456

Conditional probabilities If you flip a biased coin two times with P(H) = p, then the sample space is {HH, HT, TH, TT}. What is the probability that you flip an H first and then a T second? Let E be the event that H is flipped first. Let F be the event that T is flipped second. Then the answer to the question above is P(E ∩ F). But the first flip has no effect on the second flip, and so E and F are independent: P(E ∩ F) = P(E) P(F) = p (1 - p). Rosen p. 456
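A small sketch of this independence computation, using an assumed bias p = 1/3 (the slide's actual numbers did not survive transcription):

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)  # assumed bias P(H); illustrative, not from the slide

# Probability of each two-flip outcome: the flips are independent.
prob = {}
for a, b in product("HT", repeat=2):
    pa = p if a == "H" else 1 - p
    pb = p if b == "H" else 1 - p
    prob[a + b] = pa * pb

def P(event):
    # Probability of an event (a set of outcomes).
    return sum(prob[o] for o in event)

E = {"HH", "HT"}  # H on the first flip
F = {"HT", "TT"}  # T on the second flip

# Independence: P(E ∩ F) = P(E) P(F) = p (1 - p).
assert P(E & F) == P(E) * P(F) == p * (1 - p)
```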

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. A. They're equal. B. They're not equal. C. Not sure. Assume that each child being a boy or a girl is equally likely.

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space 2. Initial distribution on the sample space 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space Uniform distribution, each outcome has probability ¼. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where the oldest is a girl } B = { outcomes where two are girls }

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where the oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg }

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where the oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg } P(A) = ½ P(B) = ¼ = P(A ∩ B)

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? A = { outcomes where the oldest is a girl } = { gg, gb } B = { outcomes where two are girls } = { gg } P(A) = ½ P(B) = ¼ = P(A ∩ B) By the conditional probability law: P(B | A) = P(A ∩ B) / P(A) = (1/4) / (1/2) = 1/2.

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl: 1/2. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 1. Sample space Possible outcomes: {bb, bg, gb, gg} Order matters! 2. Initial distribution on the sample space Uniform distribution, each outcome has probability ¼. 3. What events are we conditioning on?

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl: 1/2. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? C = { outcomes where one is a boy } = { bb, bg, gb } D = { outcomes where two are boys } = { bb } P(C) = ¾ P(D) = ¼ = P(C ∩ D)

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl: 1/2. The probability that two siblings are boys if we know that one of them is a boy. Assume that each child being a boy or a girl is equally likely. 3. What events are we conditioning on? C = { outcomes where one is a boy } = { bb, bg, gb } D = { outcomes where two are boys } = { bb } P(C) = ¾ P(D) = ¼ = P(C ∩ D) By the conditional probability law: P(D | C) = P(C ∩ D) / P(C) = (1/4) / (3/4) = 1/3.

Conditional probabilities Are these probabilities equal? The probability that two siblings are girls if we know the oldest is a girl: 1/2. The probability that two siblings are boys if we know that one of them is a boy: 1/3. Assume that each child being a boy or a girl is equally likely.
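The two sibling answers can be reproduced by enumerating the sample space directly; a short sketch:

```python
from fractions import Fraction

# Uniform distribution over the four equally likely outcomes.
S = ["bb", "bg", "gb", "gg"]  # first letter = oldest child

def P(event):
    return Fraction(len(event), len(S))

A = {s for s in S if s[0] == "g"}   # oldest is a girl
B = {"gg"}                          # both girls
p_girls = P(A & B) / P(A)           # P(B | A)

C = {s for s in S if "b" in s}      # at least one boy
D = {"bb"}                          # both boys
p_boys = P(C & D) / P(C)            # P(D | C)

assert p_girls == Fraction(1, 2)
assert p_boys == Fraction(1, 3)
```

The asymmetry comes entirely from the conditioning events: A rules out two outcomes, while C rules out only one.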

Conditional probabilities: Simpson's Paradox

                Treatment A                  Treatment B
Small stones    81 successes / 87 (93%)      234 successes / 270 (87%)
Large stones    192 successes / 263 (73%)    55 successes / 80 (69%)
Combined        273 successes / 350 (78%)    289 successes / 350 (83%)

Which treatment is better? A. Treatment A for all cases. B. Treatment B for all cases. C. A for small and B for large. D. A for large and B for small.
C. R. Charig, D. R. Webb, S. R. Payne, J. E. Wickham (29 March 1986). "Comparison of treatment of renal calculi by open surgery, percutaneous nephrolithotomy, and extracorporeal shockwave lithotripsy". Br Med J (Clin Res Ed) 292 (6524): 879-882. doi:10.1136/bmj.292.6524.879. PMC 1339981. PMID 3083922. cf. Wikipedia "Simpson's Paradox"

Conditional probabilities: Simpson's Paradox

                Treatment A                  Treatment B
Small stones    81 successes / 87 (93%)      234 successes / 270 (87%)
Large stones    192 successes / 263 (73%)    55 successes / 80 (69%)
Combined        273 successes / 350 (78%)    289 successes / 350 (83%)

Simpson's Paradox: "When the less effective treatment is applied more frequently to easier cases, it can appear to be a more effective treatment."
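The reversal in the table can be checked arithmetically; a sketch using the slide's numbers:

```python
# Kidney-stone data from the slide: (successes, total) per treatment and stone size.
data = {
    "A": {"small": (81, 87), "large": (192, 263)},
    "B": {"small": (234, 270), "large": (55, 80)},
}

rates = {}
for t in ("A", "B"):
    (ss, sn), (ls, ln) = data[t]["small"], data[t]["large"]
    rates[t] = {
        "small": ss / sn,
        "large": ls / ln,
        "combined": (ss + ls) / (sn + ln),
    }

# Treatment A is better within each stone size, yet B looks better combined.
assert rates["A"]["small"] > rates["B"]["small"]
assert rates["A"]["large"] > rates["B"]["large"]
assert rates["B"]["combined"] > rates["A"]["combined"]
```

The reversal happens because A handled most of the hard (large-stone) cases while B handled most of the easy ones, exactly as the quote on this slide describes.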

The Monty Hall Puzzle Car hidden behind one of three doors. Goats hidden behind the other two. Player gets to choose a door. Host opens another door, reveals a goat. Player can choose whether to swap choice with other closed door or stay with original choice. What's the player's best strategy? A. Always swap. B. Always stay. C. Doesn't matter, it's 50/50.

Some history Puzzle introduced by Steve Selvin in 1975. Marilyn vos Savant was a prodigy with record scores on IQ tests who wrote an advice column. In 1990, a reader asked for the solution to the Monty Hall puzzle. After she published the (correct) answer, thousands of readers (including PhDs and even a professor of statistics) demanded that she correct her "mistake". She built a simulator to demonstrate the solution so they could see for themselves how it worked.

The Monty Hall Puzzle: the solution. Pick a door at random to start.

The Monty Hall Puzzle: the solution.

The Monty Hall Puzzle: the solution. What's the probability of winning (C) if you always switch ("Y")? A. 1/3 B. 1/2 C. 2/3 D. 1 E. None of the above.

The Monty Hall Puzzle: the solution. Answer: C. Switching wins exactly when your initial pick was a goat, which happens with probability 2/3.

The Monty Hall Puzzle: the solution. What's the probability of winning (C) if you always stay ("N")? A. 1/3 B. 1/2 C. 2/3 D. 1 E. None of the above.

The Monty Hall Puzzle: the solution. Answer: A. Staying wins exactly when your initial pick was the car, which happens with probability 1/3.

The Monty Hall Puzzle: the solution. What's wrong with the following argument? "It doesn't matter whether you stay or swap because the host opened one door to show a goat, so there are only two doors remaining, and both of them are equally likely to have the car because the prizes were placed behind the doors randomly at the start of the game." The flaw: the host does not open a door at random. He always reveals a goat, so his choice carries information, and the two remaining doors are not equally likely.
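A simulation in the spirit of vos Savant's demonstration makes the 2/3 vs. 1/3 split easy to see for yourself (a minimal sketch; the seeded random.Random keeps runs reproducible):

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game; return the fraction of wins."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        choice = rng.randrange(3)
        # Host opens a door that is neither the player's choice nor the car.
        # (Which of the available goat doors he opens doesn't change these
        # probabilities for the always-switch / always-stay strategies.)
        opened = next(d for d in range(3) if d != choice and d != car)
        if switch:
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == car)
    return wins / trials

print(monty_hall(switch=True))   # ≈ 2/3
print(monty_hall(switch=False))  # ≈ 1/3
```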

Random Variables A random variable assigns a real number to each possible outcome of an experiment. Rosen p. 460, 478 The distribution of a random variable X is the function r ↦ P(X = r). The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s).

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s). Calculate the expected number of boys in a family with two children. A. 0 B. 1 C. 1.5 D. 2 Answer: B. E = 0·(1/4) + 1·(1/2) + 2·(1/4) = 1.

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s). Calculate the expected number of boys in a family with three children. A. 0 B. 1 C. 1.5 D. 2

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s). Calculate the expected number of boys in a family with three children. A. 0 B. 1 C. 1.5 D. 2 Answer: C. The expected value might not be a possible value of the random variable, like 1.5 boys!

Expected Value Examples Rosen p. 460, 478 The expectation (average, expected value) of random variable X on sample space S is E(X) = Σ_{s in S} p(s) X(s). Calculate the expected sum of two 6-sided dice. A. 6 B. 7 C. 8 D. 9 E. None of the above.

Expected Value Examples Rosen p. 460, 478 Answer: B. Each die has expectation (1 + 2 + ... + 6)/6 = 3.5, and the expectation of a sum is the sum of the expectations, so the expected sum of two 6-sided dice is 7.
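Both expectation answers above can be recomputed by summing X(s) p(s) over the sample space; a brief sketch:

```python
from fractions import Fraction
from itertools import product

# E(X) = sum over outcomes s of p(s) X(s), uniform over the 36 dice rolls.
rolls = list(product(range(1, 7), repeat=2))
E_sum = sum(Fraction(a + b, len(rolls)) for a, b in rolls)

# Expected number of boys among three children, uniform over {b, g}^3.
kids = ["".join(s) for s in product("bg", repeat=3)]
E_boys = sum(Fraction(s.count("b"), len(kids)) for s in kids)

print(E_sum, E_boys)  # 7 3/2
```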

Properties of Expectation Rosen p. 460,478 E(X) may not be an actually possible value of X. But m <= E(X) <= M, where m is the minimum possible value of X and M is the maximum possible value of X.

Useful trick 1: Case analysis Rosen p. 460, 478 The expectation can be computed by conditioning on an event and its complement. Theorem (Conditional Expectation): For any random variable X and event A, E(X) = P(A) E(X | A) + P(A^c) E(X | A^c), where A^c is the complement of A.

Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? e.g. X(HHT) = 1, X(HHH) = 2.

Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Directly from the definition. For each of the eight possible outcomes, find the probability and the value of X: HHH (P = 1/8, X = 2), HHT (X = 1), HTH (X = 0), HTT (X = 0), THH (X = 1), THT (X = 0), TTH (X = 0), TTT (X = 0). So E(X) = (2 + 1 + 1)/8 = 1/2.

Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". Which subset of S is A? A. { HHH } B. { THT } C. { HHT, THH } D. { HHH, HHT, THH, THT } E. None of the above. Answer: D.

Useful trick 1: Case analysis Example: If X is the number of pairs of consecutive Hs when we flip a fair coin three times, what is the expectation of X? Solution: Using conditional expectation. Let A be the event "The middle flip is H". Given A, the four equally likely outcomes HHH, HHT, THH, THT have X values 2, 1, 1, 0, so E(X | A) = 1; given A^c, X = 0 always. So E(X) = P(A) E(X | A) + P(A^c) E(X | A^c) = (1/2)(1) + (1/2)(0) = 1/2.
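Both routes to E(X), direct and by cases, can be checked with a few lines; a sketch:

```python
from fractions import Fraction
from itertools import product

S = ["".join(s) for s in product("HT", repeat=3)]
X = lambda s: sum(s[i] == "H" == s[i + 1] for i in range(2))  # consecutive-H pairs

def E(event):
    # Conditional expectation of X given the event (a subset of S, uniform).
    return Fraction(sum(X(s) for s in event), len(event))

A = [s for s in S if s[1] == "H"]   # middle flip is H
Ac = [s for s in S if s[1] == "T"]  # complement of A

direct = E(S)
by_cases = Fraction(len(A), 8) * E(A) + Fraction(len(Ac), 8) * E(Ac)
print(direct, by_cases)  # 1/2 1/2
```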