Conditional Probability: Review
- Beryl Park
- 5 years ago
Transcription
CS70: Jean Walrand: Lecture 17. Bayes' Rule, Mutual Independence, Collisions and Collecting.

1. Conditional Probability
2. Independence
3. Bayes' Rule
4. Balls and Bins
5. Coupons

Conditional Probability: Review

Recall:
Pr[A | B] = Pr[A ∩ B] / Pr[B], so that Pr[A ∩ B] = Pr[B] Pr[A | B] = Pr[A] Pr[B | A].
A and B are positively correlated if Pr[A | B] > Pr[A], i.e., if Pr[A ∩ B] > Pr[A] Pr[B].
A and B are negatively correlated if Pr[A | B] < Pr[A], i.e., if Pr[A ∩ B] < Pr[A] Pr[B].
A and B are independent if Pr[A | B] = Pr[A], i.e., if Pr[A ∩ B] = Pr[A] Pr[B].

Conditional Probability: Pictures

Illustrations: pick a point uniformly in the unit square.
Left: A and B are independent. Pr[B] = b1 and Pr[B | A] = b1.
Middle: A and B are positively correlated. Pr[B | A] = b1 > Pr[B | Ā] = b2. Note: Pr[B] ∈ (b2, b1).
Right: A and B are negatively correlated. Pr[B | A] = b1 < Pr[B | Ā] = b2. Note: Pr[B] ∈ (b1, b2).

Bayes' Rule

Another picture: pick a point uniformly at random in the unit square. Then
Pr[A] = 0.5; Pr[Ā] = 0.5; Pr[B | A] = 0.5; Pr[B | Ā] = 0.6; Pr[A ∩ B] = 0.5 × 0.5 = 0.25.
Pr[B] = Pr[A] Pr[B | A] + Pr[Ā] Pr[B | Ā] = 0.5 × 0.5 + 0.5 × 0.6 = 0.55.
Pr[A | B] = Pr[A] Pr[B | A] / (Pr[A] Pr[B | A] + Pr[Ā] Pr[B | Ā]) = 0.25/0.55 ≈ 0.45
= fraction of B that is inside A.

Bayes: General Case

Pick a point uniformly at random in the unit square. Then
Pr[A_m] = p_m, m = 1, ..., M; Pr[B | A_m] = q_m; Pr[A_m ∩ B] = p_m q_m.
Pr[B] = p_1 q_1 + ... + p_M q_M.
Pr[A_m | B] = p_m q_m / (p_1 q_1 + ... + p_M q_M) = fraction of B inside A_m.
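As a quick numerical check, the unit-square Bayes computation above can be written out directly (using the values Pr[A] = 0.5, Pr[B | A] = 0.5, Pr[B | Ā] = 0.6 from the example):

```python
# Bayes' rule on the unit-square example.
p_A = 0.5             # Pr[A]
p_B_given_A = 0.5     # Pr[B | A]
p_B_given_notA = 0.6  # Pr[B | A-bar]

# Total probability: Pr[B] = Pr[A]Pr[B|A] + Pr[A-bar]Pr[B|A-bar]
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_notA

# Bayes' rule: Pr[A | B] = Pr[A]Pr[B|A] / Pr[B]
p_A_given_B = p_A * p_B_given_A / p_B

print(round(p_B, 3))          # 0.55
print(round(p_A_given_B, 3))  # 0.455
```

This is just the M = 2 case of the general formula, with A and Ā playing the roles of A_1 and A_2.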
Why do you have a fever?

Our Bayes square picture: priors Pr[Flu] = 0.15, Pr[Ebola] = 10^-8, Pr[Other] = 0.85; green = Fever.

Using Bayes' rule, we find Pr[Flu | High Fever] ≈ 0.58, Pr[Ebola | High Fever] ≈ 5 × 10^-8, Pr[Other | High Fever] ≈ 0.42. The values 0.58, 5 × 10^-8, 0.42 are the posterior probabilities:
58% of Fever = Flu; ≈ 0% of Fever = Ebola; 42% of Fever = Other.

Note that even though Pr[Fever | Ebola] = 1, one has Pr[Ebola | Fever] ≈ 0. This example shows the importance of the prior probabilities.

One says that Flu is the Most Likely A Posteriori (MAP) cause of the high fever. Ebola is the Maximum Likelihood Estimate (MLE) of the cause: it causes the fever with the largest probability. Recall that
p_m = Pr[A_m], q_m = Pr[B | A_m], and Pr[A_m | B] = p_m q_m / (p_1 q_1 + ... + p_M q_M).
Thus,
MAP = value of m that maximizes p_m q_m;
MLE = value of m that maximizes q_m.

Thomas Bayes

Bayes' Rule is the canonical example of how information changes our opinions. (Bayesian picture of Thomas Bayes. Source: Wikipedia.)
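The MAP/MLE distinction can be sketched numerically. The priors below are the ones from the slide; the likelihoods Pr[Fever | Flu] = 0.8 and Pr[Fever | Other] = 0.1 are assumptions chosen to be consistent with the stated posteriors (only Pr[Fever | Ebola] = 1 is given explicitly):

```python
# Posterior, MAP, and MLE for the fever example.
priors = {"Flu": 0.15, "Ebola": 1e-8, "Other": 0.85}       # p_m
likelihoods = {"Flu": 0.80, "Ebola": 1.0, "Other": 0.10}   # q_m = Pr[Fever | cause] (0.80, 0.10 assumed)

joint = {c: priors[c] * likelihoods[c] for c in priors}    # p_m q_m
p_fever = sum(joint.values())                              # Pr[Fever], by total probability
posterior = {c: joint[c] / p_fever for c in priors}        # Pr[cause | Fever], by Bayes' rule

map_cause = max(posterior, key=posterior.get)       # maximizes p_m q_m
mle_cause = max(likelihoods, key=likelihoods.get)   # maximizes q_m
print(map_cause, mle_cause)  # Flu Ebola
```

The posterior for Flu comes out near 0.58 and for Ebola near 5 × 10^-8, matching the slide: MAP picks Flu, while MLE picks Ebola even though its posterior is negligible.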
Independence

Recall: A and B are independent iff Pr[A ∩ B] = Pr[A] Pr[B], i.e., iff Pr[A | B] = Pr[A].

Consider the example below (pick a point uniformly in the unit square):
(A2, B) are independent: Pr[A2 | B] = 0.5 = Pr[A2].
(A2, B̄) are independent: Pr[A2 | B̄] = 0.5 = Pr[A2].
(A1, B) are not independent: Pr[A1 | B] = 0.5 ≠ Pr[A1] = 0.25.

Pairwise Independence

Flip two fair coins. Let A = "first coin is H" = {HT, HH}; B = "second coin is H" = {TH, HH}; C = "the two coins are different" = {TH, HT}.
Then A, C are independent; B, C are independent; but A ∩ B and C are not independent (Pr[(A ∩ B) ∩ C] ≠ Pr[A ∩ B] Pr[C]).
If A did not say anything about C and B did not say anything about C, one might expect that A ∩ B would not say anything about C; this example shows that intuition is wrong.

Mutual Independence: Example

Flip a fair coin 5 times. Let A_n = "coin n is H", for n = 1, ..., 5. Then A_m, A_n are independent for all m ≠ n. Also, A_1 and A_3 ∩ A_5 are independent. Indeed,
Pr[A_1 ∩ (A_3 ∩ A_5)] = 1/8 = Pr[A_1] Pr[A_3 ∩ A_5].
Similarly, A_1 ∩ A_2 and A_4 ∩ A_5 are independent. This leads to a definition.

Definition (Mutual Independence).
(a) The events A_1, ..., A_5 are mutually independent if Pr[∩_{k ∈ K} A_k] = Π_{k ∈ K} Pr[A_k], for all K ⊆ {1, ..., 5}.
(b) More generally, the events {A_j, j ∈ J} are mutually independent if Pr[∩_{k ∈ K} A_k] = Π_{k ∈ K} Pr[A_k], for all finite K ⊆ J.

Example: flip a fair coin forever. Let A_n = "coin n is H". Then the events A_n are mutually independent.

Theorem.
(a) If the events {A_j, j ∈ J} are mutually independent and K_1, K_2 are disjoint finite subsets of J, then ∩_{k ∈ K_1} A_k and ∩_{k ∈ K_2} A_k are independent.
(b) More generally, if the K_n are pairwise disjoint finite subsets of J, then the events ∩_{k ∈ K_n} A_k are mutually independent.
(c) Also, the same is true if we replace some of the A_k by Ā_k.
Proof: See Notes 25, 27.

Balls in Bins

One throws m balls into n > m bins.
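The two-coin example is small enough to verify by brute force. The sketch below enumerates the four equally likely outcomes and checks that A, C and B, C are independent while A ∩ B and C are not:

```python
from itertools import product

# Pairwise vs. mutual independence for the two-fair-coin example.
outcomes = list(product("HT", repeat=2))  # ('H','H'), ('H','T'), ('T','H'), ('T','T')

def pr(event):
    # Probability of a set of outcomes under the uniform measure.
    return len(event) / len(outcomes)

A = {o for o in outcomes if o[0] == "H"}   # first coin is H
B = {o for o in outcomes if o[1] == "H"}   # second coin is H
C = {o for o in outcomes if o[0] != o[1]}  # the two coins are different

assert pr(A & C) == pr(A) * pr(C)            # A, C independent
assert pr(B & C) == pr(B) * pr(C)            # B, C independent
assert pr((A & B) & C) != pr(A & B) * pr(C)  # but A∩B, C are not
print(pr((A & B) & C), pr(A & B) * pr(C))    # 0.0 0.125
```

The failure is total: (A ∩ B) ∩ C is empty, since "both heads" and "coins differ" cannot happen together.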
Balls in Bins

One throws m balls into n > m bins. Then
Pr[no collision] ≈ exp{−m²/(2n)}, for large enough n.
In particular, Pr[no collision] ≈ 1/2 for m²/(2n) ≈ ln(2), i.e., m ≈ √(2 ln(2) n) ≈ 1.2 √n. Roughly: Pr[collision] ≈ 1/2 for m = 1.2 √n.

The Calculation

Let A_i = "no collision when the ith ball is placed in a bin", so that "no collision" = A_1 ∩ ... ∩ A_m and
Pr[A_{i+1} | A_1 ∩ ... ∩ A_i] = 1 − i/n.
Product rule: Pr[A_1 ∩ ... ∩ A_m] = Pr[A_1] Pr[A_2 | A_1] ... Pr[A_m | A_1 ∩ ... ∩ A_{m−1}].
Hence
Pr[no collision] = (1 − 1/n)(1 − 2/n) ... (1 − (m−1)/n),
ln(Pr[no collision]) = Σ_{k=1}^{m−1} ln(1 − k/n) ≈ −Σ_{k=1}^{m−1} k/n = −(m−1)m/(2n) ≈ −m²/(2n).
We used ln(1 − ε) ≈ −ε for |ε| ≪ 1.
Approximation: exp{−x} = 1 − x + x²/2! − ... ≈ 1 − x for |x| ≪ 1, so −x ≈ ln(1 − x) for |x| ≪ 1.

Today's your birthday, it's my birthday too

Probability that m people all have different birthdays? With n = 365, one finds Pr[collision] ≈ 1/2 if m ≈ 23. If m = 60, we find that
Pr[no collision] ≈ exp{−(59 × 60)/(2 × 365)} ≈ exp{−4.85} ≈ 0.008.
If m = 366, then Pr[no collision] = 0. (No approximation here!)
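To see how good the exp{−m²/(2n)} approximation is, the sketch below compares it against the exact product (1 − 1/n)...(1 − (m−1)/n) at the birthday values n = 365, m = 23:

```python
import math

# Exact Pr[no collision] for m balls in n bins vs. the approximation
# exp(-m^2 / (2n)) derived above.
def p_no_collision(m, n):
    p = 1.0
    for k in range(m):  # product of (1 - k/n) for k = 0, ..., m-1
        p *= 1 - k / n
    return p

n, m = 365, 23
exact = p_no_collision(m, n)
approx = math.exp(-m * m / (2 * n))
print(round(exact, 3), round(approx, 3))  # 0.493 0.484
```

Both values sit near 1/2, confirming that m ≈ 23 is where a birthday collision becomes more likely than not.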
Checksums!

Consider a set of m files. Each file has a checksum of b bits. How large should b be for Pr[two files share a checksum] ≤ 10^−3?

Claim: b ≥ 2.9 ln(m) + 9 suffices.

Proof: let n = 2^b be the number of checksums. We know Pr[no collision] ≈ exp{−m²/(2n)} ≈ 1 − m²/(2n), so Pr[collision] ≈ m²/(2n). Thus
Pr[collision] ≤ 10^−3 ⟸ m²/(2n) ≤ 10^−3 ⟺ 2^{b+1} ≥ 10³ m² ⟺ b + 1 ≥ 3 log₂(10) + 2 log₂(m),
i.e., b ≥ 9 + 2.9 ln(m), roughly.
Note: log₂(x) = log₂(e) ln(x) ≈ 1.44 ln(x).

Coupon Collector Problem

There are n different baseball cards (Brian Wilson, Jackie Robinson, Rogers Hornsby, ...), and one random baseball card in each cereal box. If you buy m boxes,
(a) Pr[miss one specific item] ≈ e^{−m/n};
(b) Pr[miss any one of the items] ≤ n e^{−m/n}.

Coupon Collector Problem: Analysis

Event A_m = "fail to get Brian Wilson in m cereal boxes".
Fail the first time: probability 1 − 1/n. Fail the second time: 1 − 1/n. And so on for m times. Hence
Pr[A_m] = (1 − 1/n) × ... × (1 − 1/n) = (1 − 1/n)^m,
ln(Pr[A_m]) = m ln(1 − 1/n) ≈ −m/n, so Pr[A_m] ≈ exp{−m/n}.
For Pr[A_m] = 1/2, we need around m = n ln(2) ≈ 0.69 n boxes.

Collect all cards?

Experiment: choose m cards at random with replacement. Events: E_k = "fail to get player k", for k = 1, ..., n. Probability of failing to get at least one of these n players: p := Pr[E_1 ∪ E_2 ∪ ... ∪ E_n]. How does one estimate p?
Union bound: p = Pr[E_1 ∪ E_2 ∪ ... ∪ E_n] ≤ Pr[E_1] + Pr[E_2] + ... + Pr[E_n], and Pr[E_k] ≈ e^{−m/n} for k = 1, ..., n. Plug in and get p ⪅ n e^{−m/n}.
Thus Pr[missing at least one card] ⪅ n e^{−m/n}, so Pr[missing at least one card] ≤ p when m ≥ n ln(n/p). To get p = 1/2, set m = n ln(2n). E.g., n = 10² ⇒ m ≈ 530; n = 10³ ⇒ m ≈ 7600.

Summary: Bayes' Rule, Mutual Independence, Collisions and Collecting

Main results:
Bayes' Rule: Pr[A_m | B] = p_m q_m / (p_1 q_1 + ... + p_M q_M).
Product Rule: Pr[A_1 ∩ ... ∩ A_n] = Pr[A_1] Pr[A_2 | A_1] ... Pr[A_n | A_1 ∩ ... ∩ A_{n−1}].
Balls in bins: m balls into n > m bins ⇒ Pr[no collision] ≈ exp{−m²/(2n)}.
Coupon collection: n items, buy m cereal boxes ⇒ Pr[miss one specific item] ≈ e^{−m/n}; Pr[miss any one of the items] ≤ n e^{−m/n}.
Key mathematical fact: ln(1 − ε) ≈ −ε for |ε| ≪ 1.
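The coupon-collector rule of thumb m = n ln(n/p) is easy to tabulate, and doing so reproduces the examples above (rounding up to whole boxes gives 7601 rather than the slide's rounded 7600):

```python
import math

# Boxes needed so that Pr[miss at least one of n cards] <= p,
# via the union bound n * exp(-m/n) <= p  =>  m >= n * ln(n/p).
def boxes_needed(n, p=0.5):
    return math.ceil(n * math.log(n / p))

print(boxes_needed(100))   # n = 10^2, p = 1/2: 530
print(boxes_needed(1000))  # n = 10^3, p = 1/2: 7601
```

Note the contrast with collecting one specific card, which needs only about 0.69 n boxes; the extra factor ln(n) comes from the union bound over all n cards.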
More informationLecture 2. Binomial and Poisson Probability Distributions
Durkin, Lecture 2, Page of 6 Lecture 2 Binomial and Poisson Probability Distributions ) Bernoulli Distribution or Binomial Distribution: Consider a situation where there are only two possible outcomes
More informationLectures Conditional Probability and Independence
Lectures 5 11 Conditional Probability and Independence Purpose: Calculate probabilities under restrictions, conditions or partial information on the random experiment. Break down complex probabilistic
More informationConditional Probability P( )
Conditional Probability P( ) 1 conditional probability where P(F) > 0 Conditional probability of E given F: probability that E occurs given that F has occurred. Conditioning on F Written as P(E F) Means
More informationOverview. CSE 21 Day 5. Image/Coimage. Monotonic Lists. Functions Probabilistic analysis
Day 5 Functions/Probability Overview Functions Probabilistic analysis Neil Rhodes UC San Diego Image/Coimage The image of f is the set of values f actually takes on (a subset of the codomain) The inverse
More informationUC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes
UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 26: Probability and Random Processes Problem Set Spring 209 Self-Graded Scores Due:.59 PM, Monday, February 4, 209 Submit your
More informationCS70: Jean Walrand: Lecture 22.
CS70: Jean Walrand: Lecture 22. Confidence Intervals; Linear Regression 1. Review 2. Confidence Intervals 3. Motivation for LR 4. History of LR 5. Linear Regression 6. Derivation 7. More examples Review:
More informationSolution Set for Homework #1
CS 683 Spring 07 Learning, Games, and Electronic Markets Solution Set for Homework #1 1. Suppose x and y are real numbers and x > y. Prove that e x > ex e y x y > e y. Solution: Let f(s = e s. By the mean
More informationIn a five-minute period, you get a certain number m of requests. Each needs to be served from one of your n servers.
Suppose you are a content delivery network. In a five-minute period, you get a certain number m of requests. Each needs to be served from one of your n servers. How to distribute requests to balance the
More information